Thursday, December 10, 2015

Sane, rational paranoia

So I've been grinding away on a database implementation for a few days now, and today I passed a milestone peculiar to database geeks.  That's the one where the number of data tables is surpassed by the number of functions (and stored procedures) written to read, update, and delete that data.

It's the point where I start to feel safe--as safe as you can be allowed to feel, anyway--what with data hanging out on a web server.  Once, I thought of that "safe"(r) feeling as mere superstition.  And, to a degree, it is.  Just like any situation where you go all-in on one factor.

In short, this flavour of coding is the kind of thing logicians call "a necessary but not sufficient condition" for a reasonably safe application.  A lot of different skills and roles play into that.  And, even getting all the technical stuff right means bupkis if the human factor fails.  ("Hi, this is the County Password Inspector calling.  We need to verify that your password is strong enough..."  [insert involuntary twitch])

That being said, there's no excuse for not doing it.  Or for not automatically distrusting every bit of data that wants to be written to your database. 

Thing is, writing database code is just repetitive enough to be boring, but just quirky enough in its logic that I don't automate the process beyond cut-paste-edit.  And once that code is in place as the "gatekeeper" between the main application (web, desktop, API) and the data, the main application's code similarly walks the line between "boring" and "can't afford to mail this in."

Worse, it's slow business.  And not simply because so much is being written from scratch instead of copied and adapted.  This is also when the nitty-gritties that fell through the cracks of the design documents have to be fleshed out.  Even when the developers were part of the design team, this means round-tripping back with the major stakeholders.  Which means they're in meetings rather than coding.  And I can pretty much guarantee you several "Uh-ohs" occasionally punctuated by the more serious "Oooooops."

Which is why it can seem frustrating to the (non-coder) manager that resolving questions last week spawned even more questions this week.  That means your developers will spend even more time in meetings and less time doing what they're nominally paid to do.  And as a developer watching the original milestones loom (and possibly slip), it can be awfully tempting to take the short-cuts.  By which I mean querying, and (far, far worse) updating and deleting data directly in the web code. 

Now, I don't have any proof whatsoever, but I would not be in the least bit surprised to learn that those sorts of shortcuts were behind last month's VTech hack.  Because the smoking gun that left the information of 6.4 million children and adults exposed was a SQL injection hack.  "What's a 'SQL injection hack'?" a non-coder might ask.  I'll let "explain xkcd," well, explain it, because A.) He does it better than I did, and B.) Talking about SQL injection and not linking to the iconic xkcd comic (which I really need to get on a t-shirt) is like going to RenFest and not making a single "Holy Grail" joke.  It just isn't done, mes amis.
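
For the coders in the audience, here's a minimal sketch of the difference.  (C++ with SQLite purely for illustration--the table, column, and function names are my inventions, and I make no claims about what VTech's stack actually looked like.)

    #include <sqlite3.h>
    #include <string>

    // The "gatekeeper" way:  the user-supplied value is bound as a
    // parameter, so input like "x'; DROP TABLE users; --" stays plain
    // data instead of becoming executable SQL.
    bool user_exists(sqlite3 *db, const std::string &userName) {
        sqlite3_stmt *stmt = nullptr;
        if (sqlite3_prepare_v2(db, "SELECT 1 FROM users WHERE name = ?;",
                               -1, &stmt, nullptr) != SQLITE_OK)
            return false;
        sqlite3_bind_text(stmt, 1, userName.c_str(), -1, SQLITE_TRANSIENT);
        bool found = (sqlite3_step(stmt) == SQLITE_ROW);
        sqlite3_finalize(stmt);
        return found;
    }

    // The Bobby Tables way, for contrast.  Never do this:
    //   std::string sql = "SELECT 1 FROM users WHERE name = '" + userName + "';";

Same handful of lines either way; wildly different blast radius.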

Like I said, zero proof of my hypothesis.  Whichever way, though, that's bush-league work.  Let that sink in for a second:  I'm a freelance coder working out in the wilds of Grande-Digue, New Brunswick, and even I roll my eyes at that kind of sloppy naiveté.  Because SQL injection isn't half so much a technical problem as it is a human one.  And, honestly, if you have millions of customers, you can afford to hire solid, experienced, disciplined programmers, and let them do their jobs.  Which involves indulging the twitchy spider-senses developed from years of painful mistakes.  Normally, you'll only find that level of paranoia in ammosexuals and StormTrumpers.  But programmer twitchiness is the sanest and most rational paranoia you'll ever encounter.  Trust it.

Monday, November 23, 2015

Hacking the health of a planet

Today's edition of the Toronto Star carried an item that made me smile in three different senses:  Scientists Hack DNA to Spread Malaria-Resistant Gene.  On a purely topical level, this could be A Really Big Deal.  That makes me smile.

On a purely geeky level, the use of the term "hack" was encouraging.  I/T folks like myself have been trying for years to make the distinction between "hackers" (the DIY tinkerer types who have no time for Apple-level polish) and "crackers" (the folks who keep the credit monitoring firms in business...and regular I/T folks awake at night).  Alas, the rest of the world doesn't make that distinction, so even white-hat "hackers" are tarred with the same proverbial brush.

And yet...no one (geek or non-geek) likes mosquitoes, so who wouldn't get behind "hacking" their DNA, amirite? [grin]

But the question of semantics and PR is the least of the problems here.  Because I yet again have to smile--albeit wryly--at the huge obstacle common to hackers across all disciplines.  Namely, the gulf between a working prototype in the lab (or hackathon-occupied conference room) and full-scale adoption in the real world. 

Don't get me wrong--this is really-most-sincerely NOT schadenfreude.  Half a million lives a year are at stake--most of them children.  And the health of two hundred million more people each year hangs in the balance.  One would be very, very hard-pressed to overestimate the significance of eliminating mosquitoes as a vector for malaria.

There are 3,500 modern species of mosquitoes, and their lineage dates back 226 million years.  Before deliberate "gene drive intervention," humans were already triggering the development of new species by dint of pesticides.  "Hacking" every skeeter on the planet will be a knock-down, drag-out slog of decades.  But it's a worthy fight...and it sure beats the tar out of poisoning the ecosystem with whatever hell's broth Dow is cooking these days.

As a programmer/tinkerer, my goals aren't even a tenth so audacious and world-changing.  But that doesn't mean that I can't appreciate the scaling issues.  And the wherewithal it will take to surmount them.  All the best, my fellow hackers...you're going to need it.  And then some.

Thursday, November 19, 2015

Permission vs. Excuse

We all know that one person.  Likely we know multiple flavours of that one person, but let's just generalise for simplicity here.

That one person of whom I speak has the Great Idea.  Or above-average talent.  There's no reason why they shouldn't do/make The Thing (whatever it is) that would make the world better.  Except that they're being stalked by failure.  And so your offers to link them with people who could help them are sabotaged, if not rebuffed outright.  Your encouragement disappears into a black hole.  The time, you see, is never quite right.  There are too many people willing and able to take advantage of them.

We might, even to a fractional degree, be that one person at some point in our lives.  You know, making up excuses before we even do or make The Thing.

The bad news is that we will fail.  The good news is that we will fail more than once.

Which sounds illogical until you consider how empowering that is.  Give yourself permission to fail, and you will be more than equipped for the next failure.  And the next.  And so on. 

Giving yourself permission to fail is not the same as making excuses ahead of time.  The saying goes that it's better to seek forgiveness than to ask permission.  This is one of the glaring exceptions to that rule.  Even so, seeking forgiveness for failure is likewise not at all the same as making excuses.

Excuses are pernicious things, and can rob us twice.  They allow us to declare bankruptcy on our responsibility for The Thing not working out the first time.  But they can also give us a pass on learning from failure.  Which makes it less likely that we will try another way to make/do The Thing...or The New Thing.

But with the permission to fail, it's incumbent on us to clearly define "failure" from the outset.  How will we know what failure looks like?   What's the plan for changing course to avoid a crash?  What can we salvage in the event we crash anyway?  Sure, those questions take some CPU cycles.  Then again, so does sweating the timing of The Thing, or paranoia that someone will steal The Thing from you.

Now, if my Gentle Reader will excuse me, I have to go practice what I preach and fire off an email or two so that I can get back to The Thing in earnest.

Wednesday, November 18, 2015

Crisis Management vs. Management by Crisis: A Cautionary Tale

In the year 1310 CE, The Most Serene Republic of Venice had hit one of the rockiest points in her 1,100-year career.  Simply put, she was at war on too many fronts.

La Serenissima was entirely dependent on trade -- not only for her prosperity, but for survival itself.  Composed of a collection of small islands (notoriously vulnerable to tides and earthquakes), she could no longer grow her own food, much less the timber for the ships that weighed anchor throughout the Mediterranean, Levant, Black Sea, and occasionally the Atlantic.  (The "John Cabot" who crashed Canada's party in 1497 in the name of England was in fact "Giovanni Caboto," a citizen of Venice.)

But Venice's Middle Eastern trading post, Acre, had disappeared when that Crusader-held city fell back into Muslim hands.  She was by now paying the price for the greed and ruthlessness of her for-profit (cough!) "Crusade" (cough!) a century before:  She was on the outs with the Byzantine Emperor, and many of her principal Constantinople residents languished in prison.  Again. 

Closer to home, her on-again-off-again war with hated rival-city Genoa had resumed, and, for the moment, Genoa was in the lead.  And territorial squabbling over the city of Ferrara had brought down the wrath of Pope Clement V.  That wrath, in addition to spiritual penalties such as forbidding baptisms, weddings, and funerals, also included an injunction to the rest of Christian Europe against trading with Venice.

One might expect that in times when a storm appears on the horizon, people would put aside their differences and face the common dangers united.  After all, the Venetians had certainly been more than capable of this before (and would be again).  In this case, however, one would be very wrong.

The fight with the Pope, for instance, had been the most avoidable part.  The "old guard" nobles of Venice favoured conciliation, while the nouveau riche patricians tended to be war hawks.  Debate had been fierce, and had even spilled into street-bloodletting.  But, in the end, the hawks had carried the day...with disastrous results.

Now, Venice's political system was All About checks and balances.  Relatively early-on, she had rejected the notion of hereditary monarchy.  At the top of her social and political order was a Doge -- a title that means "Duke."  In reality, the office was rather analogous to that of the U.S. President, except that it was for life (or retirement) and its powers decreased, rather than increased, over the centuries.  Once, "election" had nominally been a matter of popular accord; in later centuries its Rube Goldberg complexity made the U.S. horse-race of debates, primaries, and the Electoral College look straightforward by comparison.

In the late 13th and early 14th century, however, this office's authority was still very real.  The current Doge, Pietro Gradenigo, being from the newer clans, had been the loudest of the war hawks.  And, in the threadbare tradition of arrivistes everywhere, his other (ahem!) "gift" to Venetian history was the consolidation of his class's gains by shutting out everyone below them on the social ladder.  Thus, only members of certain families were henceforth allowed to serve in what functioned as Venice's legislature, the Maggior Consiglio.  Already tending to oligarchy, its exclusivity was given the force of law in 1299.

Despite suppression by the police, the unrest, demonstrations, and public blood-letting continued.  As Doge Gradenigo's popularity plummeted, his arrogance increased.  The "old guard" (which included the Querini and Tiepolo families) plotted to overthrow the government.  For reasons unknown to history, they chose the dodgy, if dashing, Bajamonte Tiepolo as their leader.

As revolts go, this one was more than the usual let-down.  Tiepolo, for one, had absolutely no desire to roll back the oligarchy:  As far as can be determined, he planned to set himself and his allies up as the rulers of Venice, sweeping away centuries of constitutional safeguards.

But one of their co-conspirators lost his nerve (or stomach--it doesn't matter which) for the enterprise and had already informed on them.  A wild summer storm prevented one of their three contingents from linking up with the rest.  Another contingent ran into an ambush, and those who escaped had their attempts to regroup frustrated by the painters' guild (whose loyalties were certainly to the Republic, if not its Doge).  More ignominiously, Bajamonte Tiepolo paused his assault at an unlucky spot by the house of an old woman who attempted to kill him by dropping a stone out her window.  She killed his standard-bearer instead, but the confusion (plus the raging storm) took what wind remained out of Tiepolo's sails.

Surprisingly, the Bajamonte Tiepolo party (standard-bearer excepted) came out relatively unscathed, but only because they had the great, good sense to retreat to a section of the city more hospitable to their cause.  He was sentenced to four years' exile and his house pulled down, along with that of his Querini co-conspirators*.

But freak storms and feisty old ladies aside, the Venetian government had been lucky...and they knew it.  The next (and heavily secured) session of the Maggior Consiglio addressed the not-new problem of efficiently responding to sudden crises.  The issue was made more pressing by the fact that the Consiglio's exclusivity--which would only increase over the years, as the sons of noble fathers born to "commoner" mothers would be barred--set off a scramble to prove eligibility...and thus swelled its ranks.

Their solution was to streamline the handling of time-sensitive business (such as rooting out conspirators) by appointing the tribunal known as the Council of Ten.  (A misnomer, by the bye:  The Doge plus six more elected officials brought the effective head-count up to seventeen.)  True to the Venetian phobia of autocracy, checks and balances were baked in from the get-go:  Terms were limited.  Multiple members of the same family could not sit on the Council at the same time.  To prevent a single person from wielding too much power, it had three "captains," rather than one, and their terms were fixed at a single month.  Additionally, "captains" could not go out in public where they could be bribed or listen to accusations that were not vetted by the entire Council. And as the ultimate safeguard, the lifespan of the Council was set at two months.

As my Gentle Reader will doubtlessly be unsurprised to learn, that last fail-safe didn't quite work out as planned.  A clause allowing for two-month extensions to its mandate was invoked again...and again...and again...and the fiction of "extension" recurred for another two dozen years until the Council's permanency was made law in 1334.  Its term expired only with the Republic itself in 1797.

Do the math:  An "emergency" two-month response to a specific crisis lasted very nearly half a millennium.  During those five centuries, the Ten took on increasing power as, among other things, Venice's spy agency, secret police, Dept. of Defence, and diplomatic corps.  Also unsurprisingly, its secrecy and utter non-accountability earned it a sinister reputation, within Venice and western Europe at large.

In fairness, the Ten's efficiency to a large degree justified their extraordinary powers.  Checks and balances were kept, and even increased.  Alas, these became increasingly meaningless.  Two and a half centuries after permanently legalising the Council, even the Maggior Consiglio could do little more than exempt its own laws from review by the Ten.  As La Serenissima glided into the dolce far niente era roughly (and ignominiously) ended by Napoleon, the Ten's corruption was merely one tumour in the larger cancer eating the Republic.

Yet as much as historians so often succumb to an over-dramatic tendency to hang their story around those at the very apex of society, in this case I consider that dangerous.  History is not biography, unless it's in a very collective sense.  And that's what makes the events of seven hundred years ago and (for me, anyway) a continent away such a very modern tale:
  • Increasing concentration of power in hands that already hold much of it
  • Politicians willing to divide--even at the risk of being conquered
  • Those at the top of the social pecking order arrogantly treating the state as their private property...including as a weapon in their own feuds
  • Those nearer the bottom of the pecking order not uniting (and raising Hades) to stop their government from becoming a country club
  • Elective wars (including a for-profit one that destabilised an entire region)
  • Government using a temporary crisis (self-inflicted or not) to justify a permanent power-grab
That's a story we've seen play out on the nightly news as well as in the history books.  And we will continue to see these sorts of things spool from newsprint into future history books until we collectively learn to step back from the crisis-du-jour.

Technology, true, is occasionally a bona-fide game-changer (although, it turns out, this happens less frequently than you'd think...and certainly doesn't live up to predictions).  But the human condition has been fairly immutable over the centuries.  Our biggest mistake in the West has been to think of our hothouse as the world and then freak out when an outside stone smashes a window.  And our second-biggest mistake is to deny that the stone bears our fingerprints (e.g. al-Qaeda, Daesh, the Taliban).

Ignorance of 13th/14th century Venetian history is perfectly understandable (unless, of course, you're a nerd that way).  But ignorance of recent history is inexcusable.  Most especially when we allow our soi-disant "leaders" to let past missteps point the way further down the wrong path.

- - - - -

* A somewhat ridiculous coda occurred when the Querini house was found to be co-owned by three brothers, one of whom had had absolutely nothing to do with his guilty siblings.  Because destroying only two-thirds of the house proved problematic, the Venetian government sensibly (and fairly) bought out his share and then razed the structure.  History can be goofy like that.

Thursday, October 22, 2015

Quality isn't just another value-add

Previously, I mentioned that the Q-and-A period at this past Tuesday's Moncton User Group meeting ("Security Enterprise Architecture for Developers") basically resulted in two epiphanies for me.  The first, and more junior, one I riffed on during the previous entry.

My question to Jamie Rees had to do with "selling" security's value to your client as part of the application development process.  (Because we all know we should make apps. secure from the ground up, right?  But, then, we also know that we're supposed to floss every night, too.  And we all know how that plays out for most people.  Your faithful blogger included.)

I was thinking of I/T security in terms of risk management.  To wit:  A data breach costs money.  If the data relates to financial information (credit card numbers, Social Security / Social Insurance numbers, bank account numbers), the company whose data was leaked is typically on the hook for years of credit monitoring for each person affected.

Then there are the lump-sum costs.  Things like the hit to customer goodwill (which sounds really squishy, but there are accountants who specialise in quantifying that in hard cash).  Finding and patching the weaknesses in the system does not come cheap either.  Depending on the industry, third-party audits might be required.  And if the security lapses were super-egregious, heads will roll, which entails (at a minimum) the costs of hiring and training.

So I figured that this could be gelled down to a simple formula:

Average cost per breach * Likelihood of breach per year = Annual risk

If that "annual risk" (quantified in dollars) is greater than the budgeted amount for security in the Statement of Work, it should be a no-brainer, right? 

Jamie Rees had a few thoughts and suggestions, including the nugget that in security more than anything else, you have to protect yourself from entropy.  Because waiting until a crisis to fix a hole means that you will focus on that crisis alone.  But once that energy's expended, organisational fatigue will guarantee that there will be few (if any) resources spent on proactively preventing the next crisis.  (Sound familiar?)

But as I was digesting this all down for my notes, someone else raised a hand and asked the question as it should have been phrased in the first place:  "How do I sell my clients on security without selling fear?"  And, wham!  Synapses linked up, proverbial light bulbs went on.  (For all I know, the heavens opened to the sound of angel-choirs.  But Suite 201 of the Venn Centre is really, really well-soundproofed, so don't quote me on that.)

For the record, Mr. Rees's answer boiled down to mapping security to the project goals.  (Like, I might add, y'do for everything.  We're All About the goals, not the features here.) 

But what hit me was that security, really, is just another facet of Quality Assurance.  A very specific facet, it's true--and one perhaps almost large enough to overshadow its general category.

But the thing that Quality Assurance has proven over and over since Dr. Deming pioneered the discipline is that quality ultimately pays for itself.  Namely, because focusing on quality forces you to take a hard look at your organisation and its processes.  A relentless focus on quality allows far less room for the politics of personality--which includes the always-regrettable "rock star" culture.  And it has no mercy for the "But we've always done it that way" argument.

So, when pitching my services to future clients, you can bet that I will be pointing out how developing an application for their business buys them process consulting from an unbiased 3rd party as part of the package.  And all for the low, low cost of higher quality. :~)

Wednesday, October 21, 2015

When questions >= answers

So October's meeting of the Moncton User Group (@monctonug) was a bit different from the usual classroom-esque schtick.  Granted, the presenter gave a prepared spiel, followed by a Q&A period.   But, thanks to technical difficulties (i.e. the wrong laptop connectors), there was no PowerPoint.  Which can be a good thing, and in this case drove the content more into the realm of "war stories."

But I had two epiphanies, one small, and another not-so-wee.  Because I need to mop up some things before decamping for a meeting, I'm going to just focus on the wee one.

Anyone who's ever attended a conference (other than to collect tchotchkes and gawp at booth-babes...or just play hooky on the company tab) knows that the presentations are the hamburger patty, but everything else is the bun and toppings.  In other words, you don't just eat the patty.  (Not unless you have a lot of food allergies, I suppose.) 

Naturally, the networking is a big deal.  But so's the chance to pull the content of the presentation into your own context.  Normally that's done through Q&A.  But sometimes, as I discovered last evening, someone else's question is even more clarifying than your own.  Which is exactly what happened when someone (with a lot more business development experience than I currently have) followed up my question with, frankly, the one I should have asked in the first place.  I don't like using the phrase "refined my thinking" because I think it's usually a fig leaf for "why didn't I think of that?"  Mind you, that's actually what happened, but it triggered a whole new riff in my head.

That riff is the subject of the next post.  But I thought that this insight might encourage folks to click that EventBrite link the next time they're sitting on the fence about a learning opportunity.  Remember, the burger is more than the patty.  You're there for the burger.

Friday, October 16, 2015

A red, green, and blue silver lining

The Arduino platform was created more for design students than professional code-slingers.  Which is cool -- that's a noble rationale, actually.  But it comes back to bite in one way.  See, the folks who adapted its (C++-ish) programming language from the Java-ish "Processing" language aren't really able to support one reality of the modern programmer's life.  And that's unit-testing.

Again, I'm not criticising, because as it turns out, it's sort of a net win.

But first some background/terminology for non-programmers.  "Unit testing" is the act of testing the smallest possible chunks of one's code.  In modern development environments, you tend to write a function hand-in-glove with its test.  The test defines what the expected outcome is for any given set of inputs.  Then you (or some more automated process) run the test against your actual code and verify that the function did what it was expected to do.
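
A toy example of that hand-in-glove pairing (in C++, with a function I've invented for the occasion):

    #include <cassert>

    // The function under test:  clamp a raw sensor reading into the
    // 0-255 range that an LED driver expects.
    int clampBrightness(int raw) {
        if (raw < 0)   return 0;
        if (raw > 255) return 255;
        return raw;
    }

    // The unit test:  for known inputs, assert the expected outcomes.
    int main() {
        assert(clampBrightness(-40)  == 0);    // below range
        assert(clampBrightness(128)  == 128);  // in range, untouched
        assert(clampBrightness(9000) == 255);  // above range
        return 0;  // no aborted assertions == the unit passed
    }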

(Aside:  Later during the application development process, there are other forms of testing, such as integration testing--did your feature break somebody else's function?--plus performance-testing, and vulnerability scans, etc.  For the purposes of this blog item, we're only focusing on unit-testing.)


In the Arduino world, which relies on the micro-controller taking input from physical sensors, or outputting to physical widgets, simulating that in software would be a boil-the-ocean endeavour.  That's mainly due to the ever-growing number of widgets that are compatible with the Arduino.  For a community-supported foundation running on the proverbial shoe-string, that's too much to ask.  And, in any case, I think I can safely say that the community would muuuuuuch prefer to see those limited resources devoted to the core platform.

So for a complex project that involves multiple widgets, one solution is to write your code as separate projects, test it in a more atomic (meaning indivisible) fashion, and roll the tested code into the master project.  For example, my "master" project is a lighting system for the Office Finches.  There are a number of elements that go into that:
  • Two sets of bright white LEDs that are controlled by an integrated circuit because the Arduino itself doesn't have enough "pins" (i.e., input/output ports) for the number of lights needed.
  • One triplet of red, blue, and green lights that can fade in and out.  With 256 possible brightnesses available for each LED, this comes out to over 16.7 million combinations of red, green, and blue.  The idea is to gradually fade these colours in and out to compensate for the lack of full-spectrum daylight.
  • A real-time clock to determine what time it is to turn the lights on in the morning and when to turn them off at night.  (The Arduino, having a fairly primitive microcontroller, does not have an internal clock in the sense that we know it.)
  • A passive infrared sensor that is activated (only in the dark) to determine whether one (or more) of the Office Finches has fallen off her/his perch.
  • One triplet of "soft" white LEDs to be used as an "emergency" night-light.  They are gradually raised if motion is detected by the infrared sensor over a few seconds, and gradually lowered when the finches have settled back in.
  • A photo-sensitive resistor to determine whether it's dark enough to warrant the emergency night-lighting.
As I write, I'm testing the red-green-blue lighting.  The "night-light" feature has been tested, as it was the nucleus of the whole setup.  Tomorrow will probably be devoted in part to deciphering some really old (in computer years) documentation for the real time clock and setting that up for unit-testing (again, in a separate project).
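
For the curious, the heart of that red-green-blue test project boils down to something like the sketch below.  (Pin numbers are whatever your wiring dictates--these are made up.)

    // Fade red up while green fades down on the Arduino's pulse-width
    // modulation (PWM) pins.  Each channel has 256 levels (0-255),
    // hence the 16.7 million possible colour combinations.
    const int RED_PIN = 9, GREEN_PIN = 10, BLUE_PIN = 11;

    void setup() {
      pinMode(RED_PIN, OUTPUT);
      pinMode(GREEN_PIN, OUTPUT);
      pinMode(BLUE_PIN, OUTPUT);
      analogWrite(BLUE_PIN, 0);  // blue sits this demo out
    }

    void loop() {
      for (int level = 0; level <= 255; level++) {
        analogWrite(RED_PIN, level);         // brighten red
        analogWrite(GREEN_PIN, 255 - level); // dim green
        delay(20);                           // ~5 seconds per sweep
      }
    }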

Normally, having multiple copies of the same functionality is frowned-upon in software development.  The reason being that it means that you have to remember to deploy bug fixes or improvements across those copies, which costs time for the coders as well as the testers.

But in this case, I rarely modify the "test" copies of the code once they're verified and rolled into the master project.  Which leaves me with bite-sized bits of single-function code that can be used as reference for (or imported lock-stock-and-barrel into) future projects.  As long as I name them descriptively and document the functionality (which of course I do--'cuz that's how we roll chez fivechimera), it's All Good.


In this case, what I'd normally consider a "primitive" lack of testing infrastructure turns out to be a net win.  It's another one of those things they don't teach you in Programmer School.  That's not the point, after all--their job is to mold you into a cog that can be plugged into the machinery...at least for the first year or three.  But eventually, you figure out when and where "the rules" can (productively) be broken.  That's one of those career inflection-points where you slough off another bit of your "programmer" skin to reveal the "software developer" underneath.

Tuesday, October 13, 2015

Sax and Violins


Okay, not really.  No violins, anyway.  But Dennis did follow through on his persistent whim of taking up the saxophone.  Last week he picked up a used one and just now is starting to get a feel for the reed and stops.  "What's the 'Stairway to Heaven' of the sax?" I wondered out loud as he was assembling the sax and clipping on the neck-strap.

Neither of us had a good answer.  Which at first surprised me--I mean, doesn't every instrument have its own Stairway?  For instance, the piano has "The Entertainer."  Drums have the solo from "In-A-Gadda-Da-Vida"; the harp has "The March of Brian Boru," and bagpipes "Scotland the Brave."  But nothing comes to mind (my mind, anyway) as the calling-card of the saxophone.  Except maybe the theme of The People's Court.  Or possibly a Kenny G pastiche.  But, then, it's not like I could carry a tune in a two-handled bucket, so what do I know?

But then it occurred to me that any number of non-musical skills have their own Stairway.  It represents an inflection-point on the learning curve--namely, the spot where the student can start feeling confident about being competent.  Unsurprisingly, a software developer is no different in that respect.  Particularly when the developer can expect to be in perpetual student mode, scrambling up at least one learning curve at any given time.

For instance, in mainframe-based systems (meaning text-only terminals--or, even more retro--green-bar paper), the Stairway app. was something like an accounting report crunched from flat files.  (Mercifully, I haven't had the tedium of [shudder] counting columns, specifying output formats, and fretting about data overflows since Bill Clinton and Jean Chrétien were in office.)

Likewise, back in the days of Visual Basic/C++, an angel got its wings when you deployed a multi-form app. that shuttled data back and forth to some sort of permanent storage.  (Alternatively, outside the Microsoft Universe, it likely involved Lotus Notes.)  As pure client-server topologies went out of fashion in favour of the web, so (mercifully) did this skill-set (most especially debugging your way through "DLL Hell.").

Before 21st century content management systems, having a home-brew collection of HTML pages for photos of your pets, vacation photos, and a mouldering blog was the Stairway of web programming.  Or, if you were coding for a business, it was an online catalogue with a shopping-cart duct-taped on.  (Nowadays, in this age of Wordpress, all bets are off...but if it doesn't involve HTML5 plus jQuery and/or AJAX, you might still be a n00b.)

On the server side, the Stairway once was any database-backed application.  That's still the bread-and-butter of many developers, although with the emphasis on REST, it has more of a roll-your-own-API kind of feel to it.  Double that if you're providing/fielding data that could be consumed or produced by a variety of devices.

For mobile development, it seems to be the venerable list app. with cloud storage, drag-and-drop functionality, and probably some cool icons for classification.  (Disclaimer:  I'm not beyond the "Hello, Android" phase m'self, so don't take my word for that.)

Database?  If you're using a standard relational database, you should know your way around a stored procedure (with input and output parameters) that punches at least two of the SELECT, INSERT, UPDATE, or DELETE buttons.  Bonus points for advanced use of aggregate functions, conditional sorting, or pagination of results.

And, finally, in the strange world of physical computing, things get real when you have to power at least one of your widgets with a supply that's not the Arduino / Raspberry Pi / Beaglebone / Etc.  Extra credit for using interrupts or making it talk to another device that's not the PC that programmed it.

Obviously, there is a world full of other tunes to play with any instrument.  And now a solar system of things (given that humans sent a camera-spaceship out to Pluto and all) for coders to work at.  But just as musicians don't become (or stay!) musicians without practice, so it is with software development.   (Although, significantly, I have yet to see a title like Learn the Trombone in 24 Hours in the bookstore.  And, of course, there's no such thing as Autotune for programmers.  Grrrrrrr.)

My point is that when you buy an album from an artist or band, you're not actually paying for the individual notes or even, really, the songs.  You're paying for a bit of the inspiration that made them tackle those particular songs in the first place.   You're paying off the equipment.  You're paying for the collaboration of many talents.  You're paying those who mentored them, either directly or indirectly.  And, mostly, you're paying for countless hours of experimentation and error, for "Just one more try and we can call it a day" all-nighters, for head-banging frustration and moments of sheer hopelessness.  In a word, you're buying craftsmanship (and the commitment it requires).

And, although I'm not remotely musical, that sounds remarkably like software development. 

Wednesday, October 7, 2015

Debunking a dangerous meme

While I applaud the publicity for the efforts of a local-ish (to me) researcher to study the patterns of computer hackers, I always cringe when press articles focus on money and personal info.

No question that getting into your bank account is a valuable thing for a hacker.  Stealing your identity is theft one step removed:  The hacker (or thief who buys the info. from said hacker) aims to steal your identity to steal from others.

My point is that it doesn't end there. 

To wit:  You could be flat, busted broke, and have a negative credit score, and hackers will still be interested in you as long as you have a functioning computer connected to the internet.

Here are just a few ways you can still be victimised, even when you think that you are "safe" because you're not Warren Buffett:

Spam - the original flavour.  All the email addresses in your contacts list?  Those can be stolen and spammed.  I'm sure your Mom, your boss, and/or your BFF will all appreciate that...

Spam - now with new and improved Sleaze Factor(TM).   If the hacker (or hacker's client) isn't spamming your friends with dodgy V1@gr@ or Nigerian Prince come-ons, they're trying to trick your contacts into infecting their own computers.  (War story:  The one and only virus on my 2-year SysAdmin gig happened because a normally vigilant someone expected an email with an attachment and double-clicked it.  Bottom line:  It can happen to anyone.  And I mean anyone.)

Spam - the social media version.  If you have a social media account, those passwords can be sniffed and stolen.  Love those sleazy DMs you sometimes get in Twitter or Facebook that are followed by an embarrassed apology from a friend who's just wrested back control over their account?  Yeah, me neither.  Want to be the one making those apologies to your aunt?  Me neither.

Borgifying your computer.  You may never think you have a fast enough CPU or half the RAM you could use.  But believe-you-me, you have more than enough power to (unwittingly) help someone mine Bitcoins.  And, holy moly, if you think your computer is slow now...

Borgifying your computer + bogarting your bandwidth.  Remember the distributed denial of service (DDoS) attack that nearly took down the XBox network last Christmas?  That dick move was brought to the world not only by hackers, but by hordes of infected computers (otherwise known as a "botnet.")  Also, remember all that spam we were just talking about?  Yeah, that's likely being pumped through compromised computers as well.  Did Netflix streaming just sputter out?  Oh, your ISP just billed you b/c you went over your monthly bandwidth ration?  Sucks to be you...not to mention everyone else on the receiving end of your computer's shenanigans.

So, it could be just me being cynical about the human race (see afore-mentioned SysAdmin stint), but the whole "I don't have anything to hack" meme is being used as an excuse not to keep computers patched.  And that, even a decade down the road from babysitting networks, just pisses me off.

As much as I despise the codified knee-jerk hysteria that masquerades as cybersecurity legislation, sometimes I wish that people could be legally barred from having admin. rights on their own computers after Computing While Lazy.  Because when our digital lives are eating so deeply into our meatspace time, responsibility comes with the power to instantaneously connect with people all over the planet. 

Tuesday, October 6, 2015

Playing in a different league

Author/Blogger Michael Lopp (http://randsinrepose.com/) claims that, on a semi-regular basis, he crafts a Post-it short-list of people with whom he'd found a company.  Then he folds that Post-it into a small cube and swallows it.

That anecdote actually came up over dinner chez fivechimera last evening, with both Dennis & me brainstorming our respective lists.  How someone makes the list is actually a bit of a balancing act.  It's a matrix, really.  On one axis is the general skills fit--does this person have chops?  That's the easy part.  The other axis is the price of those chops in the coin of personality friction.  (How often am I going to butt heads with this person?  How many people are they going to drive away by being insufferably right at the wrong time in our trajectory?)

Yeah, I know that all non-geeks (plus a healthy percentage of the geeks I know) are nodding, reliving the pain of every socially inept interaction they've had with that person...maybe even those people.  Uh-huh:  The retentive completionist who passive-aggressively drags on the schedule until their pet feature is ready.  The Tamarian whose Rosetta Stone turns out to be Firefly quotes.  The SysAdmin who clearly missed their calling as a black market dealer.  The "idea hamster" who can't execute their way out of a proverbial wet paper bag.  Et cetera.

But unless you're relatively new in the world of work (or you're extremely unlucky), you know a few who make the cut.  And if you're not as lazy as I am, you keep in touch.  This morning's spike in my LinkedIn traffic was not a coincidence.  Which, to my chagrin, prompted a comparison between last night's brainstorming and fantasy sports leagues. 

Mind you, it's not an entirely frivolous exercise, because it inherently forces one to acknowledge one's limitations (in time and talent).  For example, here's my particular fantasy roster (names omitted):
  • Network/IT Support - Yes (with a second-string backup)
  • QA - Yes (also with a second-string)
  • Graphics and Design - Yes
  • UI/UX Developer (web and mobile) - No
  • Tech Support - Yes (with multiple levels of backup)
  • Project Management - Tentatively yes
  • Sales - Nope--and that's the 363.64-kg gorilla of my (hypothetical) staffing problems
But the uncomfortable fact remains that until I have a working prototype with which to pitch the team (to say nothing of potential clients and/or investors), my roster in the Fantasy Startup League is just that--fantasy.  It's not out there, sweating and bruising itself on the ice or the astroturf.  It's not even in the weight room or breaking down video of last week's key plays/misses.  Nope--it's sipping a beer in its armchair, shouting off-the-cuff opinions (which may or may not be founded in reality) at the screen.

The folks out there who are fighting fires, losing sleep, duct-taping, putting the "work" in "networking," rolling the bones--all the while listening for the pacing of the wolf outside the door?   No matter how small the company, they're playing in the big league.  And don't think that I don't understand the difference.

Friday, October 2, 2015

Machines and meatspace

In internet parlance, the term "meatspace" refers to the non-networked world.  You know, good, old-fashioned reality reality--that messy place where we have to look people in the eye and remember our manners as we talk to them.  Programmers, particularly those who do the database/back-end kind of coding, too often have to be reminded of its existence...at least after the business rules have been codified in the software design/specs.

Don't get me wrong--even programmers who don't write web pages or phone apps. still have to contend with real-world constraints.  For instance, on my last project, I ran up against the limits of a fairly slow processor on a cloud server.  

Now, cloud hosting companies, as a rule of thumb, keep their hardware in matched sets.  That keeps everything standardised.  It's not an OCD thing--the apples-to-apples parity means that they can scale servers up and down by adding & removing processors and blocks of memory almost on the fly.  Upgrades that would have taken days even fifteen years ago now can be done in a matter of minutes with mere seconds of downtime.  That's a Big Deal.

But for all the benefits of hosting in a virtual private server environment, it does come with a downside:  The hosting companies have a vested interest in commoditisation, which generally pushes them toward the low end of performance.  In most instances, throwing more hardware at the code is all you need.  After all, industrial-strength software such as the Apache web server or the MySQL database will spread out its requirements over multiple CPUs.

But woe betide the programmer crunching hundreds of thousands of records with a lowly script.  In my case, the PHP executable can only use one processor at a time.  In which case, 1 + 1 does not necessarily equal two.  Simply because processing data in parallel means that separate processes are in constant danger of stepping on each other's metaphorical toes.  So some extra protections (lock files, record flags in the database) had to be spliced into the code as insurance that, say, the same record wouldn't be processed twice.
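
That particular project was PHP, but the lock-file trick looks much the same in any language.  A sketch in C++ (file name and function names invented):

    #include <fcntl.h>
    #include <unistd.h>

    // O_CREAT|O_EXCL makes creation atomic:  if two copies of the
    // script race each other, exactly one open() succeeds and the
    // loser backs off.
    bool acquireLock(const char *path) {
        int fd = open(path, O_CREAT | O_EXCL | O_WRONLY, 0644);
        if (fd < 0) return false;  // another process holds the lock
        close(fd);
        return true;
    }

    // When the batch finishes (or in crash cleanup), release the lock.
    void releaseLock(const char *path) {
        unlink(path);
    }

If acquireLock() comes back false, the script simply bows out and lets the copy already running finish the job.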

But as I've side-stepped into physical computing, limitations are hammered home even more forcibly.  For instance, the Arduino and the Raspberry Pi only have a limited number of pins to work with.  Not to mention considerations like voltages (5 vs. 3.3 vs. ???) and the number of milliamperes that can be squeezed through them.  And now, picking up 3D modeling (in the form of OpenSCAD), I even have to take pesky things like gravity seriously...at least for anything that's going to be printed in plastic.  (Which is everything--CGI isn't my bag.)  See, printing plastic into thin air doesn't work so well.  And the tolerances aren't completely perfect.  And, most importantly, saving my notes from the Geometry/Trigonometry classes that I took a decade ago turns out to have been a really good idea.

I suppose that there's a silver lining, though.  As noted earlier, the vast majority of the code I've ever written is the below-the-waterline part of the iceberg.  So having to respect the limits of things besides database normalisation and Boolean logic and all that is good for you.  The same discipline applies, after all.  Code is written to be re-used.  Functions/modules are documented.  Input is validated (where possible; OpenSCAD doesn't provide exception-handling).  Code is checked into source control.  Alas, there's no such thing as unit-testing, but you can't have everything, I guess...

Tuesday, September 29, 2015

Hating on Physics

Ugh -- two hiccups in one morning.  (Mercifully, there was just enough coffee left in the thermos for them both.)

Firstly, a rough head-count of wires suggests that we may not be able to get away with a mini-breadboard for the MPL's robot -- at least, not for the "brains" part up top.  Which means more weight.  And, more annoyingly, a chassis redesign.

Secondly, a client asked me to verify that data had been imported by a scheduled job.  Alas, when I peeked into the database, I found that only one record from roughly 800 had made it over.  Going back to the original data dump (like y'do), I quickly realised that the code was fine, but all but one record was seriously hosed thanks to a freaky Excel export.  (In programming, the shorthand for this type of situation is "GIGO":  Garbage In, Garbage Out.)

The fact that the code is, technically, doing what it's supposed to do is cold comfort.  The fact remains that we have another edge-case to take into account (and gracefully handle, natch'erly) in the next release.   Which means time and budget that doesn't go into what we want to accomplish.  Booyah, legacy code.

So, as a human -- particularly a human who works from home -- I'm allowed to say a few unprintable things.  But I'm not allowed to resent the newly-discovered edge-case.  That would be as ridiculous as hating the physics of not enough breadboard wires for all the robot-widgets.  And I'm certainly not allowed to carry on as if those problems didn't exist.  Or worse, go solve the problems that interest me.

See, that's what bugs me so much about wanna-be "leaders" like The Donald or Carly Fiorina.  Forget the incendiary rhetoric.  It's when the former responds to a hard, documentable fact (e.g. that 40+% of illegal immigrants arrive in the U.S. via plane, not sneaking through the desert) with "I don't believe it."  Go pound sand, Trump:  Reality doesn't care what you believe. 

Likewise, the latter doubling down on her claims about the breitbarted "Center for Medical Progress" video, inventing footage that never existed in the original scam.  (I thought that fanfic had reached its nadir when Twilight fan E.L. James penned the 50 Shades trilogy, but...daaaaaaaaang.  That's a whole 'nuther level of gaslighting your own gender, Ms. Fiorina.)

Gosh, I can't imagine how the former has been through bankruptcy (at least) four times, or how the latter very nearly led her company off a cliff... [insert uber-sarcastic eyeroll]

Look.  I don't get to ignore reality--much less hate on it.  That would lead to solving the wrong problems...assuming they're even problems in the first place.  At best, that's a waste of time and money.  In the middle, the real problems are neglected.  And in the worst-case, that additionally creates new problems.

I assume that, like me, my Gentle Reader is paid to solve real problems within the constraints of the real world.  We should accept nothing less of ourselves.  And expect even better from those who want to have the power to declare war, spend our tax dollars, etc.  Values can be debated; facts cannot.  Anyone who cannot accept that should have their "adult card" taken away and be given a Sims account so they can live in their own reality without bothering anyone else's.

Wednesday, September 23, 2015

An experiment in learning by teaching

Originally, my contribution to the geeky "potluck" that is the 2015-2016 season's Moncton Developer User Group presentations was supposed to be the Laravel PHP framework.  But then I volunteered to help the Moncton Public Library as an Arduino coder on a robot project and found myself having to learn 3D modeling for the parts that Thingiverse doesn't have.

Sadly, my attempts at drawing anything (in two dimensions) almost make xkcd look like Rembrandt.   Okay, maybe you'd have to go to The Oatmeal or Hyperbole and a Half for that comparison.  But still.  Then I discovered that the OpenSCAD software was based on describing a 3D object mathematically, rather than drawing it (e.g. with SketchUp, AutoCAD, or Blender).  Believe-you-me, my pointy ears perked up and my spider-senses quivered.

Most software packages, unless they truly suck (or you hate having to work with them for other reasons), come with an infatuation period.  As with people, it lasts until you expect one behaviour and get something different.   But in the headlong rush of dewy eyes peering through rose-tinted spectacles, I pinged the MUG illuminati with the idea that OpenSCAD might perhaps be more interesting/useful to our crowd.  (For those outside I/T, PHP is a workhorse of a programming language, which makes it boring...except for the people who live to hate on it.)

So I spent the better part of a week's bedtime reading on the official documentation.  To help myself structure the information, I started outlining it as I understood it.  Essentially, that was the nucleus of the outline for my presentation.  Unsurprisingly, the outline has since been re-arranged, split up, re-grouped, etc. as my understanding has become more complete.  And the obligatory "Gotchas" section has grown.  And moved up in priority as I've actually used OpenSCAD to model robot parts...among other things useful to the household of a budding evil mad inventor.

As auto-didactic techniques go, I'm going to hang onto this one.  For me, it's useful structure.  For the eventual recipients of the presentation, at least they can rest assured that it hasn't been thrown together the hour before.  Win-win, right?

Tuesday, September 22, 2015

Shaving the zebra

(Not the same thing as yak-shaving--not at all.)

Because I have to work in two different worlds (namely, Linux and occasionally Windows, although not yet Apple's walled garden), I've saved myself a lot of cranium-bruising by investing in a KVM switch.  This device allows you to toggle between multiple computers using the same monitor, keyboard, and mouse.

I had originally pulled apart my office for painting, then had to re-connect my main programming PC (Linux) to do some troubleshooting.  The monitor remained blank.  Due to the urgency, I bypassed the KVM switch for the duration, figuring I'd debug later.  "Later" came tonight, when the monitor was still unresponsive.  After verifying the spaghetti of cabling (and switching to another port, and testing another monitor), I hollered for Dennis to sanity-check for me.  He did some poking and prodding, but couldn't find anything amiss.

So I disassembled the KVM wiring and again hooked the peripherals directly into the PC.  Again, everything but the monitor seemed to come online.   So back under the desk I dove, and discovered what both of us had missed--namely, that this particular model has not one, but two DVI (i.e., monitor) ports.  We'd been plugging things into the top one (supposedly interfaced with the motherboard, but not really) rather than the bottom one, attached to a big, honkin' (like, early 1990s size) video card.

There's a folksy adage that goes, "When you find a dead man with hoof marks across him, look for a horse before you go looking for a zebra."  I'd like to report that I learned this at my Grandma's knee.  But the fact is that I picked it up from the only episode of Doogie Howser I can ever remember watching.   Anyway, the logic is merely a variation on Occam's Razor.  Problem is, we tend to scan for things top to bottom, and, well, why would you continue to look for something you've already found?

The experience illustrates why, despite it being more or less foundational to scientific reasoning, we can still cut ourselves on Occam's Razor through laziness or over-confidence.  Because if the afore-mentioned dead guy is found on the plains of Africa, the familiar assumptions become useless--even counter-productive.

I suppose that's the point of the celebrated "Five Whys" of the Toyota Production System:  It forces debugging beyond the immediate and superficial. 

As it turns out, the KVM switch is still hosed.  Naturally, I only learned this only after hooking up everything through it again.  #mumblegrumblemumblegrumble  But at least there was more certainty in the debugging this time around (verifying with a laptop whose video output the monitor likewise ignored).  I'll ping Matt at BJW Electronics tomorrow to see whether repairs are even an option in this case.  I fervently hope so, and not just because I dislike adding to the landfill:  KVM switches are bloody expensive.

Monday, September 21, 2015

Back from "staycation"

Shocking precisely no one (except a rookie developer or rookie developer's client -- which we are not), the limited Beta had rough edges.  Those were filed off while I've been racing the calendar with painting (most notably my office).  And learning 3D modeling (more on that later) by the tried-and-true method of figuring out why my code blew up.

But the office is going back into shape this evening (assuming I can press-gang The Other Programmer In Residence's longer, stronger arms for fifteen minutes).

Blog resumes tomorrow.

Monday, August 24, 2015

Innovation iconoclasm: Beyond the cult of the start-up

I stalled out about a quarter of the way through Geoffrey Moore's Dealing With Darwin, which has nothing to do with the quality of the book (which is excellent), and everything to do with my ability to be distracted.  Now I'm picking it up again as my "nightcap" reading. 

It kind of hit a nerve when I learned that the office where I spent the best years of my career has been broken up into functional "pods" (for lack of a better term).  That's on the heels of a shake-up that saw a spike in my LinkedIn social circle.  Zo wellz--the silver lining is that it spares me the risk of nostalgia.  I mean, yeah, I can be nostalgic about the days when 20 or so of us were proudly referred to by the boss as "The Island of Misfit Toys."  But we were also working on a scrappy new bet-the-branch-office product then.  That takes a particular alchemy--not mere chemistry, which is far too predictable--of personalities to pull off.

And then at some point, you wake up and find yourself and your product in a more mature market--which is a whole 'nuther game.  That's what Dealing With Darwin is about.  And probably why it's overlooked on most business reading lists.  After all, Moore (and his colleagues) are known for the consulting work that preceded and spun out of Crossing the Chasm.  The latter focuses entirely on growing a product market from the adventurous early adopters to the more sceptical mainstream (by bridging the gap between their very divergent needs).

That early stage company bringing something brand-new to a market with no mental map for what they're making/selling is what (nearly) everyone associates with "innovation," right?

Darwin, however, hammers home the much-neglected truth that there are more species of innovation than the brand-new product.  Things like the following require "innovation":
  • Adding new features to an existing product (e.g., a camera to a phone)
  • Making an existing product do more with fewer resources (e.g., lower-power computer chips)
  • Tapping a new (unexpected) market for an existing product (e.g., Viagra was originally a failed treatment for high blood pressure)
  • Up-scaling an existing product/market for higher profit-margins (Starbucks, Apple Computer, Whole Foods)
  • Streamlining and standardising supply chains and work-flow (e.g., Ford Motor Company, McDonald's, Dell Computer, etc.)
  • Re-tooling work-flows and supply-chains to emphasise quality and reduce the cost of mistakes (e.g. Toyota Motor Corporation)
  • Reducing transaction friction/overhead with the end-consumer (e.g., Zipcar, Netflix)
  • Abdicating responsibility for labour and safety laws by declaring your employees "contractors" and yourself a "technology platform" (e.g. Uber, TaskRabbit)
  • Itemising core services and adding surcharges for them (e.g., airlines, banks)
Okay, so maybe those last two are a little bitey, but that doesn't change the fact that there's a certain level of (ahem!) "creativity" in them, not to mention a relentlessness in execution that has--regrettably--made them the new normal.

The point is that established companies can't rest on their proverbial laurels.  Any good idea will have copycats--some better and faster than others--and the "first to market" advantage has a limited shelf life.  After that, it takes management discipline (and probably no small amount of luck) to stay ahead.  Which will yield higher returns--investing R&D dollars into making a better product, or into reducing costs?  Or maybe (just maybe) it's time to exit the race to the bottom and bring that skunkworks project into the light of day?

Those are not easy questions to tackle, particularly with all the baggage and politics of an established money-making track-record.  While individuals may too often throw away tangible good in pursuit of phantoms, organisations are not so often guilty.  Maybe it would be easier if we more readily recognised innovation when it wears business casual in Toronto instead of just a Red Bull-stained hoodie in Silicon Valley.

Friday, August 21, 2015

Frivolous Friday, 2015.08.21: Do it for Darwin

There are currently two categories of Darwin Awards.  There's the kind where the recipient has removed himself or herself (but mostly himself, as it turns out) from the gene pool.  And there's the Honourable Mention--a.k.a. still dog-paddling in the gene pool.  In the midst of the huffing and--let's face it--pure schadenfreudish glee over the Ashley Madison hack/leak, I can't help but see a Darwinian window of opportunity.  Humour me for a minute and hear me out, because I'm sorta-kinda-maybe serious here.

The thing that struck me the most about the news--okay, okay...the thing that struck me after I stopped smirking over the completely unsurprising revelations about Josh Duggar, Jason Doré, and Ottawa, Ontario--was how many blithering, mouth-breathing, knuckle-dragging idiots used "real" email addresses to register an AM account.  Like, email addresses that their spouses would recognise.  Or, more stupefyingly, their work email addresses.  Seriously, this is the level of stupid that warrants having one’s email privileges revoked.  Possibly for life.

Attention, I/T Managers of the world:  This is your moment.  

Having clocked two years of desktop support and babysitting servers, I feel your pain.  And I was one of the lucky ones--we only had one virus on my watch, and that (mercifully) not even a Zero Day.  But I've heard your war-stories.  And I think we all intuitively recognise that the Ashley Madison species of idiot shares the majority of its DNA with the type of co-worker who's habitually guilty of:
  • Installing malware (e.g. Napster, browser widgets, fake “virus scanners,” etc.) on their workstation #truestoryaboutnapsterbythebye
  • Double-clicking any (and every) email attachment that lands in their Inbox
  • Leaving files open (and thereby read-only to everyone else) on the shared drive
  • Forwarding chain-email
  • Adding misspelled words to the spell-check dictionary because they’re convinced that they’re right and Microsoft is wrong
  • The Dreaded Reply-All ('Nuff said, amirite?)
  • Font crimes (Again, ‘nuff said.)
  • Being congenitally incapable of grokking the difference between CC: and BCC:
  • Likewise, being congenitally incapable of printing double-sided copies of anything...or even switching their default printer during repairs
  • Storing Every. Single. File. (and program shortcut) on the Windows desktop
  • Using the same password for every system...and storing it on a Post-it taped to their monitor
  • Insisting on access to systems/reports they never actually use
  • Calling support because their workstation is "slow," only to have support arrive to find Facebook open and the browser scrollbar a millimetre tall
  • Passive-aggressively weaponising the issue-tracking and/or project management system
I'm sure other SysAdmin veterans could expand the list in hilarious and/or toe-curling ways, but I think that I've established my point.  That point being that I/T folks now hold in their hands the power to effectively chlorinate the office gene pool--perhaps even for a generation.

I/T comrades, for the love of those of us who don't drag you away in the middle of critical system rebuilds (or at least away from playing FPSes for hours on end to, uh, "gather performance metrics" on said systems), get ye to the darkwebs!  And bring thy RegExp A-game with thee!  Scan those lists!  I can (almost) guarantee that if you find your company domain(s) among them, the correlation between those AM sneaks and your laziest Luddites, your most slovenly sociopaths, your most vapidly venal office-critters will be statistically perfect.
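For the script-inclined, the heavy lifting amounts to a few lines.  A minimal sketch (Python, with a hypothetical dump file name and stand-in domains--season to taste):

    import re

    # Hypothetical inputs:  a plain-text dump and your company's domain(s).
    COMPANY_DOMAINS = ("example.com", "example.ca")

    pattern = re.compile(
        r"[A-Za-z0-9._%+-]+@(?:"
        + "|".join(re.escape(d) for d in COMPANY_DOMAINS)
        + r")\b",
        re.IGNORECASE,
    )

    # Collect each matching address exactly once, then report.
    with open("am_dump.txt", encoding="utf-8", errors="replace") as dump:
        hits = {m.group(0).lower() for line in dump for m in pattern.finditer(line)}

    for address in sorted(hits):
        print(address)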

In which case, you know what to do.  Take screenshots.  (Bonus points for correlating them with router logs.)  Send them to every single printer in the building and "forget" to pick them up.  Then wait for the radiation fallout to stop glowing, and you'll find that many of your work-day problems have taken care of themselves.  And, more importantly, you will have done everyone who can be trusted with a computer a huge solid.

My I/T brothers and sisters, it's time to step up to the plate.  This kind of opportunity doesn't surface often.  Personal codes of ethics aside, I think we can all agree that using your work email account to cheat on your spouse or significant other shows a breathtaking lack of professionalism--quite apart from the sheer sleaziness of it all.  Yeah, I don't want to work alongside that, either.

Monday, August 10, 2015

All networks are not created equal

I'm not knocking La Crosse, WI (more accurately, French Island), but we had pretty sorry luck in neighbour relations.  On one side, the summer weekend evenings were spiced by the Four Letter Badminton Olympics.  On the other, a couple who couldn't wrap their brains around how anyone could find their beagle's 30+-minute barking jags annoying.

I figured that the lack of camaraderie was largely our karma:  We tend to keep ourselves to ourselves, and my West Coast work schedule probably didn't help with the random encounters that cement relationships bit by bit.

Grande-Digue has been rather different, pretty much from the get-go.  The residential lots in these parts tend to run larger than the quarter-acre of our French Island subdivision.  Yet, almost counter-intuitively, the bonds tend to run a lot stronger.  Part of it is the Acadian nature of the place--generations of Poirers, Fougères, Bourgeoises, Legers, LeBlancs, Robichauds, Melansons, Gallants, Cormiers, et al. growing up like an extended family.  But apparently, not so tribal as to shun arrivistes like ourselves.

But even after three years and change, I'm still sometimes caught out by how tight-knit the place is.  Today was one of those days.  For instance, Canada Post left a delivery notice for a package that required a signature...a notice which I promptly left at home.  I rarely see the lady who takes care of late-afternoon customers, so I pulled out my driver's license to prove that I indeed matched the addressee on the package.  She waved it away with a smile:  "I know who you are."

An hour or so later, the neighbours for whom we've been cat-/chicken-sitting stopped by with a thank-you gift from their weekend in Quebec.  In the context of chatting about their travels, I casually mentioned that we're considering a jaunt to Newfoundland.  Turns out that because we also mentioned it to another neighbour, it was already old news.

I should have known.  In both cases.

And so, while the social media mavens obsess over the magic n-squared-minus-n-all-divided-by-two formula, it's wise to remember that the strength of connections trumps the number of nodes.  At least when you're actually interested in making more than a superficial impression.  That's the way most folks around here seem to want to roll.  That's cool by me.
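(For the uninitiated:  n-squared-minus-n-all-divided-by-two--that is, (n² - n) / 2--counts the possible connections among n people.  It balloons in a hurry:  20 people make for (400 - 20) / 2 = 190 potential connections; 200 people, 19,900.  Impressive, until you remember that a count is all it is.)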

Saturday, August 8, 2015

Cynical Saturday, 2015.08.08: Meowtivation to stay focused

For the second time in a handful of weeks, we're looking after our neighbours' chickens and cat while they're out during normal feeding time.  Dennis is the Chicken Whisperer of the household, so I'm on cat detail.

My sister (wisely, in light of years of sibling warfare) rarely ever asked me to babysit her three kids, so I still have some of that whole "being a corrupting influence" thing to work out of my system.  Brioche (the afore-mentioned cat) isn't yet an adult, so she's a good candidate.  Starting with the joys of the laser pointer.

She's crossing over from kitten to adolescent, currently in that phase when the ears, tail, and feet are waiting for the rest of her to grow into them.  And in that time, her hunting style has changed.  Kitten Brioche would tear pell-mell after the red dot, braking too late and sliding across the floor and into a chair.  (Ooops--I made sure that didn't happen again.)  Teenage Brioche (mere weeks later) slinks in the cover of shadow when possible, advances on silent paws, and darts in short bursts so as to easily pivot.  Moreover, I think she may have started to clue in:  At one point, the dot disappeared from her view, and while I was attempting to manoeuvre it back, she intently watched the hand that was holding the laser pointer.

One thing that hasn't changed, however, is the cat's preference for the dot over a string.  As an experiment, I dangled the string tonight, let her catch it, and while she was wrestling me for it, shone the red dot in her line of sight.  She immediately abandoned the physical "prey" already in her grasp for the flashy, infuriatingly elusive one.  And I thought, "There's totally a metaphor in here."

Like an unlucky alignment of the planets, Thursday evening was Debate Night for both Canada and the United States.  In the U.S., Fox News showcased the top ~60% of the candidates they will endorse for the next sixteen months.  In Canada, the leaders of the Conservative, New Democratic, Liberal, and Green Parties held their first debate for the national elections coming up in October.  (Shocking precisely no one, none of the Canadian candidates threw verbal molotov cocktails--either during the debate or the following day.  Ahem.)

In short, there was a lot of politics in the air.  And if internet forum comments are anything to go by, the similarities between (some) humans and Ms. Brioche can be marked.  Particularly in the way they can be manipulated to ignore the hand holding the string or laser-pointer.  But even more particularly in their tendency to abandon working toward tangible goals in favour of chasing phantoms.

On both sides of the border, roads and bridges are a mess.  Health care systems are top-heavy.  Educating the next generation too often devolves into political spit-balling.  (In the U.S. it's passing off fairy tales and wishful thinking as "science"; in Canada it's segregating French and English as if they'll somehow contaminate each other like it's 1066 all over again.)  Corporate welfare needs to end--period.  The tax code punishes people for earning a living rather than siphoning record profits off an increasingly outsourced economy (esp. in the U.S.).  The only plan for the coming grey boom is importing young people from other countries.  (Which seems particularly counter-productive when automation and relentless off-shoring will increasingly steepen the ratio between workers and available jobs--another fact that no one seems to have the guts to look in the eye.)  There is far, far too little daylight between the regulators and the regulated in industry.

There is no shortage of fires against which to hold our elected leaders' feet.  And they will--regardless of party--try to sidetrack any issue that affects their campaign contributions.  The current political machinery is geared to produce outrage.  Outrage is not necessarily a bad thing, if properly channeled.  But never, EVER follow "leaders" who encourage you to punch down, not up.

Like the "leaders" who waste millions of taxpayer dollars drug-testing welfare recipients to catch a mere handful.  Like the "leaders" who shut down early voting, impose ID laws, and make people stand in line for hours to vote to stamp out fraud that was a statistical nullity to begin with.  Like the "leaders" who amend constitutions to prevent same-sex couples from enjoying the rights of their hetero neighbours.  Like the "leaders" who threaten, insult, or dox anyone who dares criticise them in front of God-n-everybody--thereby encouraging their posse to do the same.  Like the "leaders" who can slash education budgets to punish those "union thug" teachers while building sports stadiums for prima-donna athletes.

That sort of thing is the mark of the red dot.  Mercifully, Canadian voters have only about two more months before the election; we U.S. voters have sixteen moons of manufactured horse races, kerfuffles, ineffectual-to-non-existent fact-checking, bigotry, Breitbarting, grand-standing, and Borgia-class character assassination ahead of us.  (Anyone up for a betting pool on when Trump rage-quits, rage-returns, then rage-quits again?)

Those sixteen months will take focus.  But we're all smarter than cats:  We know better than to waste our energies chasing the red dots that are wriggled in front of us.  Riiiiiight???  Let's not get distracted by the flashy stuff.  We have serious work to do in the ballot-box.

Monday, August 3, 2015

The shelf life of books

Part of the problem with graduating as a History/English double-major is that you've had years to get used to stuff in print staying passably fresh.  Like the sour cream in the 'fridge that should be primordial soup by now, but still passes the sniff test.  Written history and literature may be reinterpreted, and every great once in a while new extant material is discovered.  But the words really don't change.

Yet, although my college education has served me reasonably well in the past quarter-century, the above is one mental bad habit I haven't been able to ditch.  But this past week brought it home forcefully that I was, to a certain extent, spoiled for four years.

For reasons I can't now recall, the leftover (read: oddball) bookshelf that ended up in our bedroom was mostly stocked with programming-related books.  But then I needed to move them to paint the baseboard and walls, which forced me to take a hard look at a collection that would have been better triaged before the big move in 2011.

Some things are timeless--at least as timeless as they can be in I/T.  Give up my copies of Kernighan and Ritchie's The C Programming Language or Bjarne Stroustrup's The C++ Programming Language?  Not on your tintype.  Alas, the years have not been so kind to other tomes, covering (among other things):
  • Visual Basic 6
  • C# 2010
  • Adobe Flex 4
  • Java I/O (circa 2000)
  • Java Servlets (circa J2EE 1.1?)
  • Microsoft Visual InterDev 6
  • T-SQL querying for SQL Server 2005
  • Java2 Certification prep.
  • ASP.NET MVC 2 
  • Two Java books I remember buying when I started my I/T career (1999)
In fairness, Dennis still does enough work in classic ASP that my 2001-vintage reference is still relevant to his work.  But otherwise, owch.  What in the noodley name of The Flying Spaghetti Monster was I even thinking when I packed some of these?

Maybe, in a perfect world, Sally Annes would happily collect these books to send to programmers behind the Iron Curtain who were still working on Windows 95.  Scratch that--a perfect world would not include the Soviet Union.  Okay, maybe those programmers are really Portlandia hipsters coding IE5/Netscape apps ironically.  Or something.

But the harsh reality is that these language versions will--unlike even bad fiction--never return to vogue, except possibly for 2038 remediation work, and I'm not about to guard them for the next 20 years to save the world from Unixmageddon.  (Sorry, civilisation.)

Now I just have to bring myself to dispose of them responsibly.  Being the kind of bibliophile who's still outraged over the Library of Alexandria, I'm not sure I can consign them to the wood stove this winter.  (Besides, that VB6 book is like, five inches thick.)  Pity we're not into vermicomposting--we'd be set for bedding for quite some time.  Not to mention that we'd have the nerdiest worms in the province.

Thursday, July 30, 2015

A funny thing happened on the way to modern English*

One more tidbit from the (ongoing) deep-dive into ancient/classical Rome.

Latin and modern English pretty much agree on a broad definition of "patron."  A patron looks out for something or someone.  When a customer patronises a business, s/he is interested in keeping it going--perhaps even when there are cheaper and/or more convenient equivalents.  In the days when kings and nobles sponsored writers and artists, they were considered patrons.  (Of course, a rising middle class meant that patronage tended to become more democratic, but the basic tenet still held.)

The resemblance ends there, however.  The definition of patron (and the relationship between the patron and those he patronised) changed a bit over the centuries, but the basic idea ran more along the lines of reciprocal obligations.

Oddly, the patronised were called "clients," which sounds strange to modern ears, because the client was in what was considered a socially inferior position--and, by definition, held less power.  It doesn't map cleanly to the later feudal obligations of serf -> minor nobleman -> major nobleman -> king.  But it was knit tightly into the social structure and Roman ethos of trustworthiness.

A patron's obligations to his clients, depending on the time, could include providing financial support, brokering marriage arrangements, representing his client in court, and using his influence to shield the client from the tax collector or to right some wrong done to the client.

The client would (most importantly) vote for his patron.  But (again depending on the century), he might also go to war alongside his patron.  Or provide what services he was capable of performing.  He might contribute money to the patron's efforts (political campaigning or religious services).  And if the client was powerful enough to have clients of his own, those sub-clients might be put at the patron's disposal.

The eye-opening facet of this relationship was that it was binding on one's heirs.  In essence, it became quasi-familial.  Patrons and clients could not testify against each other in court.  If a client died without an heir, his patron would inherit his property.  That freed slaves automatically became clients of their former owner was no surprise, but people conquered by a Roman army would (again, at certain points in history) be considered "clients" of the commanding general.  Or the Emperor (which was sometimes one and the same).

Today, the only echoes of this relationship can be found in the attorney-client / doctor-patient / confessor-sinner relationships.  All are to some extent protected by law and professional ethics in a way that other business relationships aren't.

(Of course, in I/T, "client" has another meaning entirely.  But rather than a client-patron relationship, it's a client-server relationship, which is where our definitions really go off the rails...)

When you dabble (as I do) in etymology, you're used to meanings outgrowing their words, or sometimes shrinking within them.  And sometimes the meanings almost turn themselves inside out.  As a business owner, I thought it was interesting how the meaning of client transitioned from one who relied on someone more powerful (sometimes for their sustenance) to the person who really holds the power in the relationship.  Thus, client and patron have become almost equivalent in meaning, at least from a business's standpoint.

- - - - -

* A wink to the Zero Mostel vehicle.

Monday, July 27, 2015

The Roman(ce) of organisation

I was a little under the weather a few days ago--just enough to take the edge off.   Sometimes the internet (by which I mean being sucked into the ADHD maelstrom of social media or the news) is the wrong kind of distraction.  Instead, I ended up going down a different (albeit rather more focused) rabbit-hole:  Roman history of the classical period.

Dennis is fascinated by Roman history.  But, then, he's more or less the military historian of the household.  And, let's call a spade a shovel here--the Roman empire was basically organised around conquest more than trade.

A lot of noise is made about the proverbial bread and circuses (particularly by the libertarian right) being the eventual downfall of Roman civilisation.  That's far, far too facile.  Bread and circuses?  Really?  When the military and the government come to operate hand-in-glove?  And the military largely consists of large private armies loyal to a single leader?  And the empire has to invade new territories to pay off the soldiers recruited after conquering those soldiers' territory?

That's basically a Ponzi scheme--with all the hallmarks of a tin-pot dictatorship besides.  We all know how those schemes--and sometimes even the dictators-- eventually end.  (Shocking precisely no one, Rome once had four Caesars in a single year.)

Plus, Rome's dependence on slavery--on average, one in three residents was a slave--gave little incentive for advancing technology.  And the slave trade was one of the more lucrative aspects of war-mongering.  Which, of course, left even less incentive for inventing labour-saving devices.

Now.  I'm not saying that Dennis isn't correct.  The Romans were straight-up fascinating.  Most notably for their organisational skills.  (To me, the obsessive organisation of the Roman armies is far more interesting than tactics, battles or campaigns.  Making vs. breaking and all that...)

After all, it's not like war-mongering made them unique in the ancient Western world (or probably any other place and time in human history, really).  Neither did slavery:  Romans were rank amateurs compared to, say, the Spartans.  Nor did religion--theirs was a mongrel mish-mash based somewhat on the cosmology borrowed from Greece, but with plenty of Etruscan leftovers and cults imported from as far afield as Persia (modern Iran).  (And that doesn't even count those pesky mono-theists like the Hebrews and Christians who showed up in the middle of the story.)  Culture?  Nope--classical Rome had a decided inferiority complex:  "Real" intellectualism came from (or imitated) Greece.

But nonetheless, time has been kind to the Roman legacy in the Western world.  Take away the letters "J," "U," and "W," and you have the classical Latin alphabet.  (Mercifully, however, their numbering system has largely been left by the wayside in favour of the more sensible Arabic one.  Want to know why Star Wars takes place "a long time ago"?  Because they're using Roman numerals for the episode numbers.  Just sayin'.)  Until a couple of centuries ago, their language was still the lingua franca of the educated elite.  The five major modern Romance languages--Spanish, Portuguese, French, Italian, and Romanian--all derive from Latin.  (And let's not forget that roughly one in three English words derives from French.)  The Western calendar (with some corrections and innovations like the leap year) is largely their legacy.

Part of that is, of course, the legacy of Empire--particularly one that stretched from the middle of the U.K. to Istanbul.  Flexibility in adapting to local customs and politics is a virtue, but flexibility in language and standards (e.g. measurements, dates) is the short road to administrative suicide.  On the flip side, in a world where people are used to squabbling with their nearest neighbours from time immemorial, adopting the systems of the Empire means that no one actually has to compromise.  (Human beings are awesome, amirite?  [eyeroll])

In a milieu where even the longest-lived empires (e.g. the classical Greeks) didn't survive much more than a few hundred years, the Romans--despite the serious flaws in their civilisation--could hold out for roughly a millennium.

And yet today, "organisational skills" is such a throwaway term--the kind of fluff one puts on a resume or emphasises in an interview as part of the ritualised employment dance.  As a liberal arts graduate, I've had a quarter-century to shake my head over how "communication skills" are treated much the same way--universally demanded, but rarely valued.  It was only after reflecting on the classical Romans that I realised that talent in organisation falls into the same category. 

In that light, I suppose it comes as no surprise (in an age of rock star CEOs) that we all know who Julius Caesar is, but take for granted the much larger legacy of often anonymous census-takers, accountants, lawyers, tribunes, consuls, engineers, scribes, et al. who gave cohesiveness to what otherwise might have been a flash in the pan based on a handful of military victories.

Monday, July 20, 2015

Beware the fair-weather meteorologist

Today, as a favour to a client, I spent about an hour on the phone with a consultant who was helping him be reimbursed for some of the cost of "our" application via a government program.  The aim of the program is to promote research and development into new products/designs...even when those things don't necessarily pan out.

This app. certainly qualifies in that regard:  At the start, we had no idea that it would even work.  Two years and a few significant adjustments later, we're still rolling the bones on each iteration.

My part in this interview was to basically tell the story of each stage in the application from a boilerplate format something like this:
  • What was the problem we were facing?
  • What did we do to address the problem?
  • Did it work according to expectation?
  • Either way, by how much?
  • What unexpected problems/obstacles (if any) cropped up?  
Pretty straightforward stuff.  Particularly that last part.  Because no software design ever survives its contact with the real world unscathed...assuming it survives in the first place.  But I was caught a bit off-guard by the interviewer's admonition to tell him all about the problems, even (as he put it) the kind of thing that I wouldn't have told my client.

Uhhmmmm, there's no such thing as those kinds of things.  I only sit on problems long enough for one of the following to happen:

Option A:  I have (at a minimum) a strong gut-level diagnosis plus a plan to verify/disprove that hypothesis.

Option B:  No immediate diagnosis is forthcoming, but I know exactly where I'm going to start looking.

Arriving at either option shouldn't take very long.  Half an hour, maybe.  Basically, experience/instinct kicks in...or it doesn't and my Google-fu skills get a workout.

But that's not really the important part.  What's important is choosing to work with clients who understand that freaking out at problems only makes them harder to solve.  Not only the problem(s) at hand, but future problems as well.

Sure, freak-outs guarantee that you (as a client) won't hear about a lot of the small problems--at least not the kind that are easily and quietly fixed.  Maybe you'll sleep better for that.  At least until the less-easily-fixable small problems fester into big ones.  All because energy that could have been spent on diagnosis is instead channeled into suppressing the symptoms.  (And let's not even count the time wasted on blame-slinging and finger-pointing.)  I have no sympathy for a client relationship run on a "no bad news" principle.  It's tantamount to only accepting sunny forecasts from the meteorologist.  Or buying stocks from perpetually bullish brokers.  Die of pneumonia in the poor-house, and who's to blame, really?  Exactly.

Yes, I realise that no one likes to report (much less hear about) project line-items slipping a schedule date, or costing more than initially budgeted.  Or that off-the-shelf software will require more customisation than advertised.  Or that a Zero-day security hole has to be patched and rushed out the door in the middle of a release cycle.  Or that Google/Facebook/Bing/Microsoft/Apple/PayPal/Etc. has arbitrarily changed their terms of service.  Or that a massive DDOS attack is measurably impacting an app.'s response times.  Whatever.  Stuff happens.  Professionals deal with it.

In lieu of falling into a "no bad news" mindset, here's an alternative.  You (again meaning the client) can't insist that there will be no bad news.  At least without everybody knowing you're on holiday from Reality.

But you can--within reason--insist on no surprises.  Obviously, things like security flaws, third party service failures, etc. aren't always predictable.  (Hence, the "within reason" qualifier.)  Developers, you can insist on working in a No Freak-out Zone as the price of delivering no surprises.  (Pro tip:  If you're not sure about the client's commitment to the "no surprises" credo, find a small, no-brainer problem to test the waters.  Bring it--and your plan to fix it--to the client in person or over the phone.  None of this passive-aggressive end-of-the-week-hoping-they've-already-left email nonsense.  Nope.  Own it.)


Eliminate the freak-outs, minimise the surprises, and a lot more of the problems will sort themselves out much sooner.  After over a decade in software, I can pretty much promise you that.

Monday, July 13, 2015

In scripto veritas*

The last few (work)days have been a blast back to my DOS past--at least in the sense of being very script-oriented.  (The similarities end there.  My first computer, an Epson built around Intel's 8088--a 16-bit chip on an 8-bit bus--ran off two 5.25" floppy drives and was certainly not networked.)  I've had the luxury of a web-based interface to the MySQL database, but otherwise all interaction with this new server has been via SSH, SCP, and SFTP.  White text on the black background of a command-prompt, in other words.

Threading through the maze of branches of a file system (in the case of SFTP, on the client- as well as server-side) definitely requires a bit more presence of mind than a "windowed" UI.  Particularly when you need to make sure you're not uploading Beta code to a Production system...or vice-versa. Then there's the fact that misspelling anything will generate an error message.  And in a Unix-based system, that means even perfectly matching the capitalisation.

But the lack of a pretty UI over the top of those interactions is oddly reassuring.  When I launch a script via a command-line and collect its output from a log file, I feel like I can trust what it's telling me.  Ditto when keeping an eye on the system processes after a scheduled (a.k.a. "cron") job launches.  Maybe it's just that I was "raised" on the command-line.  Or maybe it's the fact that *nix doesn't try to protect its users from their own fumble-fingering or short attention spans.
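A job like that doesn't need to be fancy to be trustworthy; it just needs to leave a paper trail.  A minimal sketch (Python, with a made-up log path and placeholder work):

    #!/usr/bin/env python3
    # Skeleton of a cron-launched job:  no UI, just a timestamped log
    # that can be read--and trusted--after the fact.
    import logging

    logging.basicConfig(
        filename="/var/log/nightly_job.log",  # hypothetical path
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
    )

    def main():
        logging.info("job started")
        # ...the actual work would go here...
        logging.info("job finished")

    if __name__ == "__main__":
        try:
            main()
        except Exception:
            logging.exception("job blew up")  # even the failures leave tracks
            raise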

Either way, I appreciate the straight-talk.  Normally, I distrust truth rendered in terms of black-and-white.  But this is definitely an exception.

- - - - -

* I'm riffing on the Latin "in vino veritas," which translates as, "In wine, [there is] truth." 

Monday, July 6, 2015

The irony of "Internet time"

Normally, moving code and data from one server to another is something I try to do in the middle of the weekend (e.g. late Saturday night).  Alas, even the best-calculated schedules sometimes are thrown awry.  In this past weekend's move--a.k.a. "migration"--I was actually ahead of the usual curve of permissions not being correctly set up (which is normally the biggest show-stopper).  But then my connection to the server would drop intermittently.

I've chosen the new hosting company for this application (and others yet to come) largely based on the fact that customer service is a matter of sending an email or picking up the phone.  Not punting a form submission into a queue picked up halfway around the globe.

True to form, debugging has been ongoing since Saturday evening, and the preliminary diagnosis is a DNS issue.

Now, if you're not in I/T, the only thing you really need to know is that DNS (or the Domain Name System) basically functions as the phone book of the internet.   Networked servers, just like phones, are known by a number.  But we humans know the people (and companies) associated with those phones as names.  So, just as you would search WhitePages.ca by name (and city) to find a number, your web browser queries a DNS server to translate the human-readable "www.duckduckgo.com" to the network address "107.21.1.61."
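In code, that whole "phone book" lookup boils down to a one-liner.  A sketch using Python's standard library (the address you get back today may well differ from the one above):

    import socket

    # Ask the resolver--and thus, ultimately, DNS--for the number behind the name.
    print(socket.gethostbyname("www.duckduckgo.com"))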

That lookup and translation happen so immediately and (usually) so seamlessly that it's easy to take for granted.  (Unless, of course, you're a Rogers customer.)  Until it doesn't work, and a website you know is legit simply can't be found--not even a 404, just a dead end.

Unlike many other networking issues, DNS problems can take a long time to completely resolve.  The reason is that when a web URL moves from one server (which is a number, remember!) to another, it can drop off the internet's radar.  That's because not all DNS servers update at the same rate--some of them can take up to 72 hours.   It's the price we pay for the decentralisation (and thus the robustness) of the internet.

But boy howdy, does a potential 3-day lag ever slow down debugging.  Not to mention that having a key feature of the modern internet move so glacially feels more than a bit ironic when everything else has been speeding up over the past 20+ years.   But it's not like irony and the internet are strangers, right?

Thursday, July 2, 2015

"How much do you charge to make a website?"

That's a fair question, and as a freelance programmer, the very last thing I will do is roll my eyes at it.  It's just that, when it's asked, I am often the least qualified person to answer it.

Here's why.

But first, a story.

The real estate agent who schlepped us all over La Crosse, WI to help us find our house was an interesting--and instructive--character.  From him I learned the adage, "The head of the fish stinks first."  Also that home-canned beets don't taste horrid like commercial ones.  ("Look at me, Doreen:  I know food.")  And the location of the best coffee shop Downtown.

But he also hammered something into my brain that should have been learned while shopping for our first house:  The "true" price of the house is the price at which it's sold.

That sounds a bit like a tautology, but it's actually a little more Zen than that.  See, when you buy a house, the price you will pay for it is actually the result of a number of constraints pulling in different directions.  And only one of those constraints, I must stress, is the budget.

For instance...

Constraint:  Family.  Who's going to have to share a bedroom?  How much schedule-Tetris in the bathrooms will be involved getting everyone out the door in the morning?  Is there enough yard to kick the kids off the XBox and into the outside during better weather?

Constraint:  Proximity to Work/School.  Long work commutes suck, which definitely puts a radius on the feasible options.  But if some schools have better reputations than others, that will shut down even more options.  Or maybe the converse is true--maybe you just want to put as much distance as possible between yourself and the rest of everybody.

Constraint:  Resale Value.  Most homebuyers don't plan on coming out the door the last time in a toes-up state.  Is the neighbourhood gentrifying or showing signs of hitting the skids? 

Constraint: Upkeep.  Really, how house-proud are you?  Are you buying it with visions of Norman Rockwell Thanksgiving dinners dancing in your head?

Constraint:  Amenities.  Is single-story required for someone less mobile?  How about shop space?  A dry basement for all those boxes you're not going to get around to opening...maybe not until the next move to see what the heck you put in them this time?  Is going off the grid (even if only after extreme weather) a priority?

The point is that a lot of different--maybe even conflicting--priorities have to be juggled.  Each non-negotiable moves the price up, while compromises often nudge it down.

But here's the thing:  You can't start house-shopping until you do two things:
  1. Identify your priorities--i.e. why are these things important?
  2. Sort out their pecking-order--i.e. why are some more important than others?
Think for a minute how silly it would feel to call a real estate agent and ask her/him, "How much does a house cost?"  (Because agents' commission is a percentage of the sale price, you'd also put yourself in dire danger of being sold more house than you could possibly want and/or take care of.)

Similarly, the cost of a made-to-order software application is (at least partly) a function of your priorities.  And until those questions are answered to your own satisfaction, the cost of matching them can't be answered, either by you or the developer.

There's one critical difference between a house and an app., however:  The "value" of your house is largely bounded by where, geographically, you are willing and able to hang your hat.  You may be able to move the needle only so much on that.  A secondary difference is that the math of determining exactly how much something (e.g. a baby barn) is really worth is pretty fuzzy because it's mixed up with so many other factors.

When, on the other hand, you have to turn away business because you don't have a way of keeping tabs on everything currently in the pipeline, "value" is mere accounting.  Ditto failure rates or schedule slips.  And if it's not mere accounting at this point, please don't waste your money/time by pulling in a software developer until you do have a price tag on that pain.

Now, it's possible that development and associated costs could add up to more than the problem they're meant to solve.  No matter what Silicon Valley tells you, there are still some things better done by human intelligence.  But the problem at least places an upper bound:  Any solution must provide a return on investment.

The best assurance of recouping that investment is to make sure that the developer has as thorough an understanding of the problem as you do.  (Pro tip:  If a developer leads off the meeting with more answers than questions, find another one.)  After that point, "How much do you charge?" is a completely reasonable question to ask.  But not before, please.