
Wednesday, October 7, 2015

Debunking a dangerous meme

While I applaud the publicity for the efforts of a local-ish (to me) researcher to study the patterns of computer hackers, I always cringe when press articles focus on money and personal info.

No question that getting into your bank account is a valuable thing for a hacker.  Stealing your identity is theft one step removed:  The hacker (or the thief who buys the info from said hacker) uses your identity to steal from others.

My point is that it doesn't end there. 

To wit:  You could be flat, busted broke, and have a negative credit score, and hackers will still be interested in you as long as you have a functioning computer connected to the internet.

Here are just a few ways you can still be victimised, even when you think that you are "safe" because you're not Warren Buffett:

Spam - the original flavour.  All the email addresses in your contacts list?  Those can be stolen and spammed.  I'm sure your Mom, your boss, and/or your BFF will all appreciate that...

Spam - now with new and improved Sleaze Factor(TM).   If the hacker (or hacker's client) isn't spamming your friends with dodgy V1@gr@ or Nigerian Prince come-ons, they're trying to trick your contacts into infecting their own computers.  (War story:  The one and only virus infection during my 2-year SysAdmin gig happened because a normally vigilant someone expected an email with an attachment and double-clicked it.  Bottom line:  It can happen to anyone.  And I mean anyone.)

Spam - the social media version.  If you have a social media account, those passwords can be sniffed and stolen.  Love those sleazy DMs you sometimes get on Twitter or Facebook that are followed by an embarrassed apology from a friend who's just wrested back control over their account?  Yeah, me neither.  Want to be the one making those apologies to your aunt?  Me neither.

Borgifying your computer.  You may think you never have a fast enough CPU or half the RAM you could use.  But believe-you-me, you have more than enough power to (unwittingly) help someone mine Bitcoins.  And, holy moly, if you think your computer is slow now...

Borgifying your computer + bogarting your bandwidth.  Remember the distributed denial of service (DDoS) attack that nearly took down the XBox network last Christmas?  That dick move was brought to the world not only by hackers, but by hordes of infected computers (otherwise known as a "botnet").  Also, remember all that spam we were just talking about?  Yeah, that's likely being pumped through compromised computers as well.  Did Netflix streaming just sputter out?  Oh, your ISP just billed you b/c you went over your monthly bandwidth ration?  Sucks to be you...not to mention everyone else on the receiving end of your computer's shenanigans.

So, it could be just me being cynical about the human race (see afore-mentioned SysAdmin stint), but the whole "I don't have anything to hack" meme is being used as an excuse not to keep computers patched.  And that, even a decade down the road from babysitting networks, just pisses me off.

As much as I despise the codified knee-jerk hysteria that masquerades as cybersecurity legislation, sometimes I wish that people could be legally barred from having admin. rights on their own computers after Computing While Lazy.  Because when our digital lives are eating so deeply into our meatspace time, responsibility comes with the power to instantaneously connect with people all over the planet. 

Monday, July 13, 2015

In scripto veritas*

The last few (work)days have been a blast back to my DOS past--at least in the sense of being very script-oriented.  (The similarities end there.  My first computer, an 8088-based Epson, ran off two 5.25" floppy drives and was certainly not networked.)  I've had the luxury of a web-based interface to the MySQL database, but otherwise all interaction with this new server has been via SSH, SCP, and SFTP.  White text on the black background of a command-prompt, in other words.

Threading through the maze of branches of a file system (in the case of SFTP, on the client side as well as the server side) definitely requires a bit more presence of mind than a "windowed" UI.  Particularly when you need to make sure you're not uploading Beta code to a Production system...or vice-versa.  Then there's the fact that misspelling anything will generate an error message.  And on a Unix-based system, that means matching the capitalisation perfectly, too.
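For the curious, here's roughly what that kind of transfer looks like when wrapped in a script rather than typed at the sftp prompt--a minimal sketch using Python with the third-party paramiko library, and with the hostname, username, and paths invented purely for illustration:

    # Minimal SFTP upload sketch (third-party library: paramiko).
    # Hostname, username, and paths are placeholders, not the actual
    # servers discussed in this post.
    import paramiko

    HOST = "beta.example.com"            # hypothetical Beta box -- NOT Production
    LOCAL_FILE = "nightly_report.php"    # hypothetical local file
    REMOTE_PATH = "/var/www/beta/nightly_report.php"

    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.connect(HOST, username="deploy")   # key-based login assumed

    sftp = client.open_sftp()
    try:
        # Capitalisation counts on a *nix server: /var/www/Beta and
        # /var/www/beta are two different directories.
        sftp.put(LOCAL_FILE, REMOTE_PATH)
        print(sftp.stat(REMOTE_PATH))         # confirm the file actually landed
    finally:
        sftp.close()
        client.close()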

But the lack of a pretty UI over the top of those interactions is oddly reassuring.  When I launch a script via a command-line and collect its output from a log file, I feel like I can trust what it's telling me.  Ditto when keeping an eye on the system processes after a scheduled (a.k.a. "cron") job launches.  Maybe it's just that I was "raised" on the command-line.  Or maybe it's the fact that *nix doesn't try to protect its users from their own fumble-fingering or short attention spans.

Either way, I appreciate the straight-talk.  Normally, I distrust truth rendered in terms of black-and-white.  But this is definitely an exception.

- - - - -

* I'm riffing on the Latin "in vino veritas," which translates as, "In wine, [there is] truth." 

Monday, July 6, 2015

The irony of "Internet time"

Normally, moving code and data from one server to another is something I try to do in the middle of the weekend (e.g. late Saturday night).  Alas, even the best-calculated schedules sometimes are thrown awry.  In this past weekend's move--a.k.a. "migration"--I was actually ahead of the usual curve of permissions not being correctly set up (which is normally the biggest show-stopper).  But then my connection to the server would drop intermittently.

I've chosen the new hosting company for this application (and others yet to come) largely based on the fact that customer service is a matter of sending an email or picking up the phone.  Not punting a form submission into a queue picked up halfway around the globe.

True to form, debugging has been ongoing since Saturday evening, and the preliminary diagnosis is a DNS issue.

Now, if you're not in I/T, the only thing you really need to know is that DNS (or the Domain Name System) basically functions as the phone book of the internet.   Networked servers, just like phones, are known by a number.  But we humans know the people (and companies) associated with those phones as names.  So, just as you would search WhitePages.ca by name (and city) to find a number, your web browser queries a DNS server to translate the human-readable "www.duckduckgo.com" to the network address "107.21.1.61."
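If you'd like to see that phone-book lookup for yourself, here's a tiny sketch using nothing but Python's standard library (the address you get back today may well differ from the one quoted above):

    # Ask the resolver for the number behind a human-readable name.
    import socket

    hostname = "www.duckduckgo.com"
    address = socket.gethostbyname(hostname)   # performs the DNS lookup
    print(hostname, "->", address)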

That lookup and translation happen so immediately and (usually) so seamlessly that it's easy to take for granted.  (Unless, of course, you're a Rogers customer.)  Until it doesn't work, and a website you know is legit suddenly can't be found.

Unlike many other networking issues, DNS problems can take a long time to completely resolve.  The reason is that when a web URL moves from one server (which is a number, remember!) to another, it can drop off the internet's radar.  That's because not all DNS servers update at the same rate--some of them can take up to 72 hours.   It's the price we pay for the decentralisation (and thus the robustness) of the internet.
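The mechanics behind that lag show up if you peek at a record's TTL (time-to-live)--the number of seconds a caching DNS server is allowed to keep serving the old answer before asking again.  A quick sketch, assuming the third-party dnspython package and a placeholder domain:

    # Inspect the TTL on a domain's A record (pip install dnspython).
    import dns.resolver

    answer = dns.resolver.resolve("example.com", "A")
    for record in answer:
        print("address:", record.address)
    print("may be cached for up to", answer.rrset.ttl, "seconds")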

But boy howdy, does a potential 3-day lag ever slow down debugging.  Not to mention that having a key feature of the modern internet move so glacially feels more than a bit ironic when everything else has been speeding up over the past 20+ years.   But it's not like irony and the internet are strangers, right?

Tuesday, April 14, 2015

Diversity in I/T

Last night, after the Moncton User Group meeting, I tagged along to the Irish pub to socialise and hear war stories.  As is usual with such convocations, it was time well spent.

One thing I've found funny and frustrating about software folks (all other left-brain proclivities aside) is their--make that our--tendency toward "religion" when it comes to platforms, languages, coding styles, source control tools, development environments, you-name-it.

Yet, at our table we were two Mac users, one Windows user, and one Linux freak--at least when it came to primary operating systems at work.  Bi-fluency in O/Ses is certainly not uncommon, and tri-fluency not unheard-of.  One Ruby on Rails dev., one .NET dev., one Go dev., and one PHP dev.  Two Git users, one Mercurial user, one unknown.  One SQL Server user, one Postgres user, one MySQL user, and one of unknown database preference.  Two emacs users, one Visual Studio user, and one Geany user.   But we unanimously agreed that keyboard shortcuts trump the mouse when editing--which spawned a hands-on demonstration of how to select/insert multiple lines in emacs.

I trust I've made my point?

If even folks like us (who are notoriously rabid about our tools) can have that fun (and productive) of a time over a few beers (and pineapple juice...more diversity), what the heck is wrong with the rest of the world?

Monday, April 6, 2015

A belated "Happy Birthday"

My bad.  I knew last week that Microsoft's 40th birthday was coming up, and then completely spaced it out over the long weekend.  My brush with the company actually came fairly late in a Generation Xer's development.  My trajectory was TRS-80 to Apple II, followed by a three-year hiatus, and then MS-DOS alternating with early Macintosh for a bit, finally settling on Windows 3.1 and so on up.

These days the Windows development stations don't have much to do except wait for updates--and for the day when one of them takes over the duties of both and the other is recycled as a home server.

But even as a Linux user, I realise that monoculture, even in operating systems, is infeasible. Particularly in my trade, which is expected to support Mac, Windows, Blackberry, Android, iOS, and various flavours of UNIX.  Moreover, as both the Irish Potato Famine and the rash of script-kiddie-fueled havoc of the 2000s taught us, monoculture can be downright disastrous.

Still, it would be distinctly un-Hufflepuff of me not to acknowledge the audacious moon-shot that Bill Gates, Paul Allen, et al. (nearly) pulled off.  A computer on every desk, running DOS/Windows (depending on the decade).

It wasn't Borg-like single-mindedness, nor even plain luck.  Painting with a broad brush, some of the key factors in Microsoft's ability to make three billionaires and a whopping 12K millionaires from its 1986 IPO are, IMLTHO:

The Economics of Complementary Products:  Meaning that if the price of peanut butter drops, you should be able to sell more jelly as well as more peanut butter (even if the price of jelly remains stable or even rises a little).  Personal computer prices were plummeting even as their capabilities followed the exponential trajectory of Moore's Law.  That made it easier to justify spending more money on software.  The trade-off would become more opaque as new PCs and laptops began shipping with pre-installed copies of DOS/Windows.  And then the greater volumes spurred more efficiencies (some good, some appalling) in the electronics industry, and...well, here we are...

Apple in a Tail-spin:  Jobs had been pushed out by Sculley and Woz had bailed as Apple's first act was fizzling out.  (If my Gentle Reader's memory of Apple before iPods and Macbooks is a little fuzzy, think Robert Downey Jr. between Ally McBeal and Iron Man and you're pretty close to the mark.)

Stepping-stone Partnerships:   After Microsoft bought a CP/M work-alike (86-DOS) and adapted it for the IBM PC, it retained the rights to market its own version of the resulting PC-DOS.  As PC clones rushed into the market, the go-to operating system (MS-DOS) was a no-brainer.  Microsoft also collaborated with IBM on the OS/2 operating system, but DOS (and other business ventures) eventually proved too lucrative, and Microsoft dissolved the OS/2 partnership.

A Bigger Bogeyman:  Strangely, there was once a time when Microsoft was an underdog.  It's generally accepted that, as the PC boom was mushrooming, IBM made superior hardware.  Alas, at a higher price-tag and with absolutely zero intention of following anyone's standards but their own.  PC-DOS and OS/2 work notwithstanding, Microsoft found itself part of an intrepid little band that also included Intel and Lotus in the battle for how RAM is used by programs.  (And people cheered them on!)

"The Second Mouse Gets the Cheese":  Because first-mover advantage is overrated.  Particularly in new markets where consumers aren't sure what they're supposed to want yet.  I think that fairly characterises the mid-1980s when people wanted a computer because all the cool kids had one.  (As, later, they just had to have their AOL, and Facebook, and iGadgets, and...)

"Knowing Where the Bodies are Buried":  Microsoft's inside knowledge of their operating system gave an insurmountable strategic advantage to developers of the Office suite of products.  In other words, it's developers could take short-cuts through the operating system that companies like Lotus (of 1-2-3 fame) or Corel (WordPerfect) couldn't even know without first reverse-engineering. 

Lack of Complacency:  At least in the 1990s, Microsoft was highly jealous of its dominance, and targeted threats--real and imagined--with extreme prejudice.  Which was reflected in Gates's now-(in)famous "Pearl Harbor Day" reaction to an entire realm of software (a.k.a. the internet) that had materialised outside the fiefdom of the desktop operating system.  As many of us likely remember, shenanigans ensued.  Any web developer who still has to support Internet Explorer 8 could probably make the case that we're still paying for those shenanigans today.

Embracing Hardware Diversity:  It's basically the network effect, in that the value of a network is a function of the square of its number of members. Thus, the more things you can do with a computer, the more valuable it is.  That encompasses hardware as well as apps.  And goodness knows I've wasted enough (sometimes fruitless) time with Linux and wireless to appreciate when "it just works."  Yes, there's a Microsoft Mouse, and the XBox, but overall, Microsoft has not let itself become too distracted by new hardware form-factors.


Alas, Microsoft's relationship with third-party developers, historically, has enjoyed levels of dependency and dysfunction normally not found outside Tennessee Williams' plays.  Obviously, there has been a lot--and I mean a LOT--of (ahem!) "unsportsmanlike conduct" perpetrated from Redmond, WA.  Let's not gloss over that, not by any means.  Nope.

Worse, I'm not even referring to Steve Ballmer's lizard-brained "cancer" characterisation of open-source software.  Microsoft took a PR black eye after it copied a feature from BlueJ, then initiated a patent filing--even after management was aware that the work had been pilfered.  In that instance, MS (wisely) backed off.  But Windows developers are/were (or should be / have been) all-too-aware of the truth in the ancient Greek blessing "May the Gods stay far from your destiny."  To wit:  Make a Windows app. or plug-in that's too successful, and Redmond might just bundle a knock-off with the next version of Windows (or the applicable application).  Which puts said developer out of business--after taking on all the risk and sweat of prototyping and marketing.

But, hey, at least that's a real product, and not mere weaponised marketing--a.k.a. vapourware.  I'd hate to know how many companies or proto-companies tanked simply because someone at Microsoft whispered in the ear of Slashdot or Wired or PCWorld or what-have-you.  

Yet.

And yet...

When it's all said and done, I'm not entirely convinced that my Gentle Reader and I would know each other were it not for the aforesaid moon-shot.  Where I used to work, we had the acronym "BHAG" (pronounced "Bee-hag") for Big, Hairy, Audacious Goals.  And there is something--actually multiple and varied somethings--to be said for a company that can score a BHAG.  Despite the lean, uncertain times.  More importantly, despite the smug, Dom-and-Beluga fat times.

Microsoft, for all its sins (both myriad and legion), literally changed the world.  And I think that needs to be acknowledged...even by open-source partisans such as your faithful blogger.

Thursday, April 2, 2015

Rational superstition

A couple days ago, I was ready to upload spanking-new code to its web server, only to find that the FTP server wasn't accepting my password.  Just in case I was imagining things, I ran through every single password this hosting account has had during its history.  No joy.

I won't mention the name of the web hosting provider, but let's just say that they're middle of the road.  By which I mean not the kind of outfit that will ride out Global Thermonuclear Armageddon with five nines, but not bottom-feeders catering to spammers either.

So I logged into my account and went to the control panel interface to reset the FTP password.  Or tried to.  Instead of being automatically passed through from the main account page, I was again challenged for a password, and again every possible option failed.  Now, I hadn't changed the password, so the "Who's been eating my porridge?" alarm bells went off.  Mercifully, I still had sufficient access to my account to be able to submit a support ticket.

This is where my experience as the resident SysAdmin comes in handy--not so much that I have all the tools I need to debug the problem, but that I can speak the dialect of those who do.  Which includes having a good idea of what they might need from me, and trying to supply it before they even ask.  The preliminary diagnosis was a blacklisted IP address.  Because I'm in Canada talking to a U.S. provider, I guess that wouldn't entirely surprise me.  (Nothing crosses the border easily these days, don'cha'know?)

But upon further review, my IP address was found to be above suspicion, and seeing no other red flags, the support tech simply rebooted the firewall and the problem disappeared as quickly as it had appeared.  Rather like one reboots a PC/laptop to "fix" an intermittent, unexplainable problem.  Or how we reboot the router when we know darned well that Rogers (our ISP) needs to replace the gerbil powering their DNS servers.  Or how I deal with a repeatedly crashing text app. on my Android phone by restarting it.  Or how I back up email before installing system updates.

Bottom line is, even if we were able to run any given gremlin to ground, we still might not be able to chivvy it out of its lair to finish it off.  At least not without a lot of collateral damage.  Simpler just to reboot, as though the process were some ritualised purification ceremony that exorcises the demons.

Sigh.  As much as I'd like to believe that we I/T folks are ruthlessly logical and relentless in pursuit of root causes, we don't often live up to the standard.  Granted, we're nowhere near as superstitious as some breeds--notably actors--but I can rationalise it as efficient laziness, yes?

Wednesday, January 7, 2015

Third dimension of debugging

It's pretty much become an I/T cliché that software peeps blame the hardware, and hardware peeps blame software for any particularly nasty gremlins. 

The phenomenon dates, more or less, from the PC boom of the 1980s and 1990s.  Not only were there more players in the hardware market, but the standards were somewhat more in flux.  (Anyone foolish enough to whine about accidentally buying a mini-USB cable to charge their micro-USB phone within earshot of any SysAdmin of a certain age can--and should--expect to be summarily wedgied.)

The standards issues leaked into software development as well.  For instance:  Even during my late-90s stint in Programmer School, I remember losing points on a test because I assumed the number of bytes in the data types was based on Borland's C-language compiler rather than Microsoft's.  (If that seemed like gibberish to you, don't sweat it; just trust me when I say that it was kind of a big deal Back In The Day(TM).)
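For a present-day taste of the same gotcha, the size of a C "long" still depends on the platform and compiler conventions underneath--something you can poke at from Python's standard ctypes module (the results noted in the comment are what I'd expect on common 64-bit systems):

    # Data-type sizes are a platform/compiler convention, not a constant.
    import ctypes

    for name, ctype in [("char", ctypes.c_char),
                        ("int", ctypes.c_int),
                        ("long", ctypes.c_long),
                        ("pointer", ctypes.c_void_p)]:
        print(name, ctypes.sizeof(ctype), "bytes")

    # Typical results: long is 8 bytes on 64-bit Linux, but 4 bytes on 64-bit Windows.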

Then along came the (commercial) internet.  And then broadband internet.  Which, in addition to opening unparalleled opportunities for data exchange, opened unparalleled avenues for data theft, denial-of-service attacks, and the digital equivalent of graffiti.  Thus did using a PC on a network lose its dewy-eyed innocence. 

Enter the firewall.  In laypersons' terms, it's a device that can be configured to allow or deny network traffic based on where it's coming from or destined for, and also what channels (a.k.a. "ports") it's using.  (Firewalls--most famously Windows Firewall--can be software-only and designed to protect a single computer, but that's another story for another time.) 

As an analogy, think of an airport operating when zombies have taken over another part of the world.  Flights originating from entire continents might be turned away altogether.  Flights originating from "authorised" locations might be allowed to land, but are still restricted to certain gates.  Similarly, planes might not be allowed to fly to questionable destinations, or are only allowed to use certain terminals or runways.  Any overrides/exceptions to those rules, of course, are potential weak points in the system that heighten the risk of zombies taking over the entire world.
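To make the airport analogy a bit more concrete, here's a toy rule-matcher in Python--emphatically not how a real firewall (iptables, pf, or a commercial appliance) is implemented, and with the networks and ports invented for illustration:

    # Toy firewall logic: first matching rule wins, default is deny.
    from ipaddress import ip_address, ip_network

    RULES = [
        # (action, source network, destination port or None for "any")
        ("deny",  ip_network("203.0.113.0/24"),  None),  # the zombie continent
        ("allow", ip_network("198.51.100.0/24"), 443),   # partners, HTTPS only
        ("allow", ip_network("0.0.0.0/0"),       80),    # anyone, plain web traffic
    ]

    def check(source: str, port: int) -> str:
        src = ip_address(source)
        for action, network, rule_port in RULES:
            if src in network and (rule_port is None or rule_port == port):
                return action
        return "deny"   # anything not explicitly allowed is turned away

    print(check("198.51.100.7", 443))  # allow: authorised origin, approved gate
    print(check("203.0.113.9", 80))    # deny: flight from zombie-land turned away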

Then, too, security for groups of networked computers inside the same network has evolved, mainly to minimise the damage a rogue operator can do.   Folders and whole file systems, servers, databases, what-have-you can require separate passwords, or only allow you to do certain things (such as read but not update).  This is analogous to how airports are segmented into terminals, have restricted areas, etc.  It even, to some extent, mimics the redundancies in security one experiences with a connecting flight on an international trip.  Usually, that's a more seamless process than the idiocies of post-9/11 Security Theatre.

Usually.

This week has not been one of those times.  Thus do I find myself doing the legwork for a junior support tech at a shared hosting provider's Help Desk.  (Grrrrrr.)   And realising anew how much more I'm going to have to stay on top of the networking & security aspects of maintaining a server than I have in years past.  It's not so much that networking/security technology has taken a quantum leap and I'm playing catch-up.  No, it's mainly a human issue.  To wit:  When I can no longer trust the word of the wizards behind the consoles, it adds another dimension to debugging.  Debugging, I might add, that sometimes has to be done under fire in production.

These days (responsive web design and the usual Android/iOS tug-o-war excepted), hardware ranks relatively low on the list of usual suspects when a gremlin pops up.  Network and database permissions, on the other hand, have shot up that list to a near-permanent slot in second place.  Suspect #1, naturally, is always my own code.   But when I haven't touched said code since early Dec., and scheduled jobs suddenly stop working around New Year's Eve  (holidays being a favourite time for techs. to sneak in updates)?   Ding!Ding!Ding!Ding!Ding!Ding!  Ladies and gentlemen, I believe we have a new winner.

All of which sounds like whining on my part, and I'm really not.  (Pinky-swear!)  I'm merely hoping that this facet of a custom software developer's trade helps to explain why it's not a trivial activity.  And why we don't do it for free.  (Note:  Code written to scratch our own itches before we let other people make copies of it is not the same thing as "free.")  And why we don't appreciate hearing, "This should be easy for you."  And also why we are so darned insistent that our clients actually try apps. out for themselves before turning them loose on the world.  Because "It works on my network" is the new version of the dreaded "It works on my machine."

Monday, October 20, 2014

Generations

Both my parents spent the majority of their careers working in a hospital environment.   (If you want a good working definition of corporate benevolence, it would be in how my Dad's supervisor said absolutely bupkis about how long it took Dad to return from his maintenance jobs while Mom and I were in the maternity ward and nursery, respectively.  'Nuff said.)

Both parents, however, are at the stage of life where they're more likely to experience hospitals from a customer's, rather than an employee's perspective.  I called Mom today to check in on her progress after surgery a couple months back.  (No, it's not the first time I've called her since then.  Even I'm not such a horrid child as that.)  For the record, she's back out in the yard, shelling walnuts, fully up-to-date on the doings of the wild critters and feral cats, etc.  Business as usual, in other words.

Mom mentioned that the hospital where she'd worked until a couple years back had asked her if she was interested in volunteering.  She said "no."  Which didn't surprise me--she has enough going on right now, even without recuperating from her third surgery in three years.  But then I had an ear-full about everything her former employer has outsourced since she started working there--which, for context, was during the Carter Administration.

Food service, IIRC, was the first to go.  Now housekeeping has been outsourced.  So has billing.  Because, of course, nutrition has nothing to do with health.  And neither does cleanliness.  (My Gentle Reader naturally picked up on the sarcasm there.)  And when, thanks to data-sharing limitations, my Mom is batted like a ping-pong ball between Accounts Receivable and at least two insurance companies when she's still half-whacked-out on painkillers, I'm going to take a dim view of outsourced billing.   [sharpens fingernails] [bares teeth] 

I have a lot of fond memories of visiting that place when I was growing up:  The smell of acetone, the superball-bouncy scrambled eggs in the cafeteria, the stately pace of the elevators, the brain-in-a-jar in the Histology lab (true story).  But I can also understand why Mom turned them down, too.   So far as I can tell, it's still a clean, orderly, almost nurturing place.  But the ghosts of the nuns who built and poured their devotion into it become more transparent with every contractor who slings an ID-card lanyard around their neck.

Fast-forward a generation--meaning me--and press the "zemblanity" button, and there's today's news about IBM selling its semiconductor business to GlobalFoundries.  It's certainly not unprecedented, given IBM's sell-off of its PC/laptop business to Lenovo a few years back and some of its server business earlier this year.  Except that this isn't your typical offshoring:

GlobalFoundries will take over IBM manufacturing facilities in New York and Vermont, and the company "plans to provide employment opportunities for substantially all IBM employees at the two facilities who are part of the transferred businesses, except for a team of semiconductor server group employees who will remain with IBM."

Thus, at least in the near term, GlobalFoundries will employ IBM expats at IBM facilities to make a profit at what, for IBM, was a money-pit.  And IBM's taking a sesqui-billion-dollar hit on the deal besides.   Slow-clap for IBM management there.  (Sarcasm again, btw.)

Granted, I haven't seen the inside of the Blue Zoo since Lou Gerstner was yanking the platinum ripcord on his golden parachute.  And, even then, being "contractor scum" insulated me from the insane (unpaid) overtime and pension-jigging and attrition-by-early-retirement and other assorted idiocies inflicted by the biscuit-salesman.  But my frustration really boils down to one similar to Mom's.  Namely, that the definition of "core competence" has become dangerously strict.

Now, I'm certainly not arguing in favour of vertical monopolies.  The fact that Monsanto is allowed to GMO a seed specifically optimised for the petro-chemical atrocities they market frankly blows my mind.  As Bruce Sterling put it, "Teddy Roosevelt would jump down off Mount Rushmore and kick our ass from hell to breakfast for tolerating a situation like this."  And he's absolutely right--even when he was talking about software monopolies.

Maybe I've just been out of the server space for too long.  For all I know, pushing mission-critical data off-site to cloud servers doesn't give CIOs the willies it would have given them a decade ago.  Maybe Microsoft has finally earned enough enterprise-computing street-cred to muscle out the Big Iron in the server-room.

But I do know that outsourcing always entails friction and a certain amount of bridge-burning.  In the case of Mom's ex-employer, it's orders of magnitude easier and less expensive to retrain (or, if necessary, fire) an under-performing employee than it is to cancel a contract and find a replacement when work isn't up to snuff.  When you're a technology company that's weathered two decades of commoditisation in both hardware and software by optimising one for the other, throwing away that balance makes no strategic sense to me.

When I look at the list of things IBM flags as higher-margin (cloud, data and analytics, security, social, and mobile), there is not one of them that I would flag as being "owned" by Big Blue.  (Cloud?  Amazon.  Data?  Oracle, with Microsoft hot on their heels.  Analytics?  Everybody's a "big data" guru these days.  Security?  Nope.  Social?  Please...who isn't gunning for Facebook?  Mobile?  Are we seriously expecting an IBM phone?)

I owe IBM credit for making me a decent technical writer and for teaching me basic white-collar survival skills.  Oh, and for disabusing me of the notion that working inside the belly of the leviathan is more secure than working outside it.  But apart from my comrades and the cafeteria/coffee-shop/cleaning ladies, there's no love lost for the big blue behemoth on my end.

Yet it galls me to see a company that lionised its R&D department (and the patent lawyers who filed every brain-wave thereof) hitching its wagon to other people's horses.  Or, perhaps more aptly, jumping onto other people's bandwagons.  Because bandwagon-passengers forfeit their right to drive, right?

Monday, September 29, 2014

Shellshock, herd immunity, and the implications for the "Internet of Things"

Sometimes the analogies used for computer stuff can be a bit off-the-mark.  For instance, cordless computer "mice" aren't called "hamsters."  Computer "viruses," on the other hand, are a spot-on description.  Some are transmitted through a physical vector (historically, through a storage device such as a floppy drive or USB stick).  In other cases, transmission is seemingly airborne (such as via unencrypted wifi).

Continuing the analogy, UNIX-based systems such as Linux and Mac OSX have enjoyed the luxury of a certain "herd immunity."  In part because UNIX-based (a.k.a. "*nix") operating systems were designed to be networked, which meant that defences were baked in from the get-go.  Also, in the past, most everyone was running Windows.  And Windows users--typically but certainly not always--tend not to be the most tech-savvy.  Which, for anyone in the market for notoriety, put the virus-writing bulls-eye on Windows.

Which gave rise to a certain smugness among *nix users that ranged from a sigh and an eye-roll ("When will you people learn?") to outright pointing-and-laughing whenever the latest Windows virus was making the rounds. 

Last week's "Shellshock" vulnerability may well have brought that smugness to an end...or at least blunted it for the foreseeable future.   Mainly because the free ride of (perceived) "herd immunity" has come to an end.  Apple's OSX has definitely spread from the Graphics Department to other areas of business.  Also, the overwhelming majority of web servers are based on some flavour of Linux.  Additionally, embedded devices are increasingly based on Linux.

For programmers like myself who do the bulk of their work on Linux laptops or desktops, most of the distributions (e.g. Ubuntu, Red Hat) make it stupid-easy to install security updates.  Ubuntu, for one, definitely gets up in your grill about them.  (Plus, we generally know better than to ignore them for any longer than absolutely necessary.)  By contrast, an Android tablet (which is Linux-based), like my Nexus 7, can be more obsequious, like a velvet-footed, quasi-invisible Victorian butler.  (That tiny notification icon in the northwest corner of the screen murmurs, "Would Milady graciously condescend to update her tablet now, or after tea?  Very good, marm."  But no more than that.)  Which is dangerous from the standpoint that updates are easy to ignore, but it's the tablet that's more likely to connect to that coffee-shop wi-fi.

And the proverbial web-enabled toaster?  Fugeddabouddit.   Its manufacturer was too busy trying to squeeze half a cent off the unit cost (to appease Walmart's accounting goons) to worry about releasing software patches.  And that's precisely the problem with the Internet of Things.

And while I don't necessarily like to call for yet more regulation, I think any government that waits until after a catastrophic network attack to require patching is a reckless government.  The two major barriers to such a catastrophe are well on their way to becoming null and void.  Clearly, the industry cannot be trusted to police itself.

Cost was the first barrier.   Unlike Windows or OSX, Linux is free to use.  It is--just as importantly--also free to hack up and modify to suit your particular hardware.  Either factor is enough to make Linux a no-brainer for electronics manufacturers.  Which is fine for standalone gadgets.  But when they're exposed to a network (and by extension other devices), an evolving immune system is a non-negotiable.

There's another dimension to the cost factor, and that's in hardware.  Previously, only full-size PCs or laptops had the processing power and memory to support a full-blown operating system.  Not anymore, when the $40 Raspberry Pi runs off an SD card like the one you probably already have in your camera.  Eventually, the only limits on size will be imposed by the need to connect it to monitors and standard ports or peripherals (these days that means USB devices; who knows what hotness tomorrow will bring?).

All that, for a manufacturer, means that they don't have to spend the money and time to develop a custom operating system; they can go with a no-cost off-the-shelf platform.

The second roadblock, scarcity of always-on connectivity, is now disappearing, as public libraries and city buses and in some cases entire cities offer free wi-fi. 

The upshot is that we'll have a ubiquitous operating system that is less than likely to be immunised against the latest viruses.  And it will be living in the (metaphorical) Grand Central Station of the internet, exposed to any and all comers.  Pandemic is not a risk; it's a certainty.

I know that most people couldn't care less if their cyber-toaster tells a black-hat hacker or the NSA or the CSEC that this morning's raisin bagel was (gasp!) not gluten-free.  But that's soooo emphatically not the point here.  Why?  Because hackers do not always want your data.  In many cases, they want to siphon your processing power and your bandwidth so they can use it to attack those who do have juicy credit card numbers or email addresses or passwords or naked celebrity selfies or whatever.  Which ultimately means that when anyone's too lazy to keep up with patches, they're aiding and abetting the enemy.  And complacence, as we know from politics, is complicity.

Naturally, my Gentle Reader is too savvy and hip to be slovenly about such things.  They fully appreciate that even their Fitbit has orders of magnitude of computing and communications power beyond what put people on the moon.  And they, beyond question, have the instinctive class and sense of noblesse oblige to know that with great power comes great responsibility.

Of course they do.

Tuesday, September 6, 2011

(Yet another) Sign of the times

I tried to log in to Twitter earlier today, only to be greeted by the trademark Fail Whale and the uncharacteristic (of late, anyway) message that the web's foremost ADOS application was "over capacity."

But rather than immediately roll my eyes over scalability growing-pains, my first thought was to wonder where the earthquake/tsunami/hurricane/tornado/revolution had hit. (Or, more cynically, which overrated celebrity had died.)

Of course, nothing of the kind had happened (at least not anywhere off the "Hic Dracones Sunt" area of the American mind-map.) But I thought the fact that it was my first instinct to assume that the disaster was outside Twitter's server-room--and the fact that I didn't question this until some time later--was interesting. If I'm not alone, then I think Twitter should be congratulated on a serious milestone. (Good job, y'all.)

Tuesday, July 26, 2011

Touché.

The "car doctor" for my 15 year-old beastie changed ownership somewhat recently. I've been pleasantly surprised to notice no difference in the faces nor the service since then. But, as the courtesy van driver--somewhat older than I--schlepped me to work, I made conversation by asking how things had changed.

Naturally, I was expecting a diplomatic answer. But he went on quite convincingly about how all the same folks were in place, and the former owner had made himself deliberately scarce, practicing for retirement. Which all warmed my heart, until I was collecting my backpack and clambering out of the van and he said, "Nope, the only thing that's really changed is the computer program...and that's what takes a fellow the longest time to learn."

I'll confess that I didn't have the moxie to tell him what I do for my crust. But...point taken.

Wednesday, June 22, 2011

Short-selling the dinosaurs, or "Here we go again..."

With the rise of the smartphone, the attendant hype has included some talk about the "ghettoization" of the internet--in the sense that "the internet" is defined as content snarfed from one or more web servers by a laptop or even a horrifically retro desktop computer. Yet, as I read yet another "the reports of my death have been greatly exaggerated" article, titled The Fall of Wintel and the Rise of Armdroid, it occurred to me that the coming "ghettoization" may not be drawn so much along the lines of content producers vs. consumers as along content itself.

The distinction between playing a game on a small screen and everything that goes on behind it (interface design, scaling data and processing over multiple servers, and writing/testing/deploying all the code that makes that happen) is the distinction between the proverbial tip and the iceberg. (Even minus Kate and Leo and a whole lotta CGI.) I hope we can agree on that.

Disclosure: I don't own a tablet or smartphone, per se. (Yet.) A netbook--with a keyboard that would have put Margaret Mitchell ("Gone with the Wind") on the sidelines well before Atlanta was toasted--yes. And I've certainly been accused of shallow thinking. And not just recently, nor without justification.

Which, I'll admit, makes it seem more than a little pretentious to swim against the tide of "conventional wisdom." ('Cuz when business writers predict long-term computing trends, it's totally like, "Gartner data-point. Your argument is invalid." 'Nuff said, right?) Even against the swaggering conventionality of dudes like Mr. Allsworth--who, so far as I can tell, think they're scooping the meteor from "Fantasia" just as the dinosaurs double-take the bright light in the sky like some chorus line of "Durrrr." Because we all know how sharply striated the mainframe-to-minicomputer-to-PC adoption was, yes?

Mockery aside, I think I can safely predict that we're living in a Golden Age of niches--perchance even a Cambrian explosion of computing life-forms. Simply because hardware is cheap, software alternatives range somewhere between "cheap" and "free," and tying together systems is not limited to dedicated telephone wires--owned, I might add, by a monopoly. Making the statistical likelihood of such one-or-the-other thinking panning out rather on par with being struck by lightning during a shark attack.

No doubt 24/7 availability of fully networked computers responsive in a more three-dimensional sense will change the equation somewhat. But the fact remains that small screens with cramped user interfaces are geared to forms of content for which a desktop setup you can immerse yourself in for twelve hours straight (thanks to three monitors, keyboard, mouse, and who-knows-what-besides) is thermonuclear-scale overkill.

For instance: There's snapping a photo, cropping it, tagging it, uploading it--yea, even with LOL-caption. There's firing off the multi-person SMS message otherwise known as a tweet--or even a skinny post. Stupid-simple, and as close to "free" (in terms of time and money) as possible for both the creator and the recipient. Then there's the longer-term commitment of content on the level of, say, "Avatar" or "Inception." Even bootlegged copies carry the cost of going on two hours of time. (And, in an economy where too many work more hours for less compensation, don't ever make the mistake of discounting the value of "idle" time!)

Seriously now...will the next Lady Gaga video be mixed on an iPad as facilely as a throwaway iApp can make caricatures of your photos? Me, I'm thinking not. And not only from the standpoint of raw computing power--something that typically comes in inverse proportion to the prized battery life of such devices. A multi-screened Mac, fully accessorized, by contrast, will capture the nuances that dumbed-down resolutions and tinny, cheap earbuds will not. All the difference in the world between a handful of Facebook friends and millions of "L'il Monsters," in other words.

In short, content is not created equal. Either in the creation or the consumption, I might add. And never will be. Just like sometimes you can get by with the "fun-size" Snickers bar you poached from the communal candy jar--the calories don't count if you pitch the wrapper in your cube-mate's waste-basket. Honest--I read it in "Scientific American." But at other times nothing short of the infamous seventeen layer "death by chocolate" volcano cheesecake torte from the local Tchotchke's will do.

Or something like that.