Our current neighbourhood is a little more communal than the previous one. Maybe it's an Acadian thing. But the borrowing and sharing (especially when the gardens are coming in) has an old-time small-town feel. Thus, I was seriously-and-for-realz not at all surprised to see a text from the across-the-back-lawn neighbour asking (out of the blue) whether Dennis would want to borrow his spare chicken-coop in lieu of Dennis having to scramble one together before the snow flies.
Because that's just what good neighbours do, right?
Dennis will probably build to his own design anyway, but it's good to know that it's there in case the weather makes the tarp-covered Quonset-hut run inadequate for the three remaining birds. Because of the need for a heated waterer, we'll need to run electricity out to the coop. Which opens up other possibilities for monitoring things like temperature, humidity, and overall air quality.
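Just to make the idea concrete: strip away the Bluetooth plumbing, and the heart of the thing is a dumb threshold check. A minimal, Arduino-style sketch--with a DHT22 temperature/humidity sensor, made-up pin numbers, and guessed-at thresholds standing in for whatever I actually end up using--might look something like this:

```cpp
// Hypothetical coop-monitor sketch: read temperature/humidity, sound the
// buzzer if anything drifts out of range. The Bluetooth reporting layer is
// omitted; Serial stands in for it here.
#include <DHT.h>

const int DHT_PIN    = 2;        // hypothetical data pin for the DHT22
const int BUZZER_PIN = 3;        // hypothetical buzzer pin

const float MIN_TEMP_C   = 2.0;  // keep the waterer (and birds) from freezing
const float MAX_TEMP_C   = 30.0;
const float MAX_HUMIDITY = 85.0; // damp coops breed respiratory trouble

DHT dht(DHT_PIN, DHT22);

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
  Serial.begin(9600);            // stand-in for the BLE link to the app
  dht.begin();
}

void loop() {
  float tempC    = dht.readTemperature();
  float humidity = dht.readHumidity();

  // A failed read comes back as NaN; treat that as alarm-worthy, too.
  bool outOfWhack = isnan(tempC) || isnan(humidity) ||
                    tempC < MIN_TEMP_C || tempC > MAX_TEMP_C ||
                    humidity > MAX_HUMIDITY;

  digitalWrite(BUZZER_PIN, outOfWhack ? HIGH : LOW);

  Serial.print(tempC);
  Serial.print(",");
  Serial.println(humidity);

  delay(60000);                  // once a minute is plenty for a chicken coop
}
```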
The setup I have in mind is Bluetooth-enabled, allowing an Android app. to check in on the current status of those numbers and to set off an alarm if they get out of whack. And there's less than no sense in keeping that design/code to myself if someone else can get the benefit of it for the cost of a few electronics components. So I texted our neighbour to see whether they have any Android devices in the house.
Because that's just what good neighbours do, right?
(Well, not one hundred percent: A second set of feathered guinea-pigs for field-testing benefits me, too. Let's not get too crazy with the altruism angle here...)
Nope, texts the neighbour: They're strictly an Apple household. (Fooey.) But then he asks how they would keep the stored data.
Hoo boy: That's a whole 'nuther basket of eggs. Because now we're talking back-end: Databases, a web application, all that jazz. Granted, that's basically my core competency as a developer. But it's also scope-creep (on radioactive steroids!) at this proof-of-concept juncture.
And yet. It's also useful to know that someone's already thinking long-term. Assuming that I ever wanted to jump through all the UL-type certification hoops to sell this kind of contraption to backyard chicken-herders. Mind you, two data-points do not define a market, but the fact that a single text dragged the scope so far beyond the pale tells me that I wasn't thinking "holistically" enough. Even for this six-chicken neighbourhood.
But, then, that's just what good neighbours do, right?
Monday, February 22, 2016
When Empathy > Technology
Dennis's shirts hang on the left side of our shared closet, mine on the right. The closet doors are the typical sliding ones, so that when one side is open, the other side is occluded by the door. When the doors are open to my side, light from the window is largely eclipsed by anyone standing in front of it, leaving the job of illumination to anything coming from the left. On the opposite side, it's the opposite story.
Thus, Dennis hangs up his shirts so that they face the right; I hang up mine so they face the left. In a logical Universe, we would respect this light-optimised orientation when hanging up each other's shirts. But even a dual-programmer household falls well short of the Spock/Sherlock ideal. Alack.
The mayhem and havoc wreaked by misaligned clothing can be quantified in terms of extra fractions of a second required to select the t-shirt whose snark and/or geek culture in-joke best matches our mood-of-the-moment. A #firstworldproblem if there ever was one, in other words. But it illustrates the power of personal norms to trump logic (and the instinct to use it). And, in a way, it makes me despair for human progress as driven by first world technology...or even most first world technologists (in whose number I count myself, btw).
Silicon Valley has been panned by folks as diverse as Valleywag and Startup L. Jackson for burning so many calories turning paper millionaires into paper billionaires while infantilising the twenty-something dudebros who are the face of its culture/ethos. The first is just what shareholder capitalism is optimised to do. (The second is just plain pathetic.) Neither of them can be considered truly "disruptive"--at least not in the net positive sense their apologists would have you believe. Sure, it's taking bites out of the taxi and hotel industries by socialising the costs of businesses formerly held more accountable via regulation. But, hey, you can't make a creative destruction omelette without breaking a few social contracts, amirite?
It's not even a private sector ailment. NGOs can (and do) squander resources applying first world thinking outside the first world. Case in point: The first attempts to convince Cambodian families to add a lotus-shaped chunk of iron to their cook-pots to reduce/eliminate anaemia fell short. Follow-up visits discovered the iron being used for other purposes, notably doorstops. But casting the iron in the shape of a fish considered "lucky" by locals changed the game. Anaemia has been eliminated in 43% of trial subjects, and a sustainable business model was spawned in the process.
Moral of the story: Sometimes it's the users, not the technologies, that have to be "hacked." The catch is that those of us who are paid to be problem solvers have the instinct to hack technology first. Don't get me wrong--I'm a huge proponent of usability. The bigger a technology's side effects, the more incumbent it is upon its designers to make it as impossible as possible to misuse. I get it.
But the slickest, most bulletproof interface in the 'verse means bupkis if it A.) isn't solving a worthwhile problem, and/or B.) is too expensive (in terms of price, infrastructure support, externalised costs, etc.) for those who would most benefit from it.
So, to recap, to successfully "disrupt" anything, the designers/developers need to:
- Allow people to improve their lives/families/communities in a way that was previously impossible
- Allow them to do it in a way that doesn't require huge (for them!) investments or later remediation
- Ensure that misuse is darned near impossible, even for users with nothing beyond rudimentary training
Monday, November 23, 2015
Hacking the health of a planet
Today's edition of the Toronto Star carried an item that made me smile in three different senses: Scientists Hack DNA to Spread Malaria-Resistant Gene. On a purely topical level, this could be A Really Big Deal. That makes me smile.
On a purely geeky level, the use of the term "hack" was encouraging. I/T folks like myself have been trying for years to make the distinction between "hackers" (the DIY tinkerer types who have no time for Apple-level polish) and "crackers" (the folks who keep the credit monitoring firms in business...and regular I/T folks awake at night). Alas, the rest of the world doesn't make that distinction, so even white-hat "hackers" are tarred with the same proverbial brush.
And yet...no one (geek or non-geek) likes mosquitoes, so who wouldn't get behind "hacking" their DNA, amirite? [grin]
But the question of semantics and PR is the least of the problems here. Because I yet again have to smile--albeit wryly--at the huge obstacle common to hackers across all disciplines. Namely, the gulf between a working prototype in the lab (or hackathon-occupied conference room) and full-scale adoption in the real world.
Don't get me wrong--this is really-most-sincerely NOT schadenfreude. Half a million lives a year are at stake--most of them children. And the health of two hundred million more people each year hangs in the balance. One would be very, very hard-pressed to overestimate the significance of eliminating mosquitoes as a vector for malaria.
There are 3,500 modern species of mosquitoes, and their lineage dates back 226 million years. Before deliberate "gene drive intervention," humans were already triggering the development of new species by dint of pesticides. "Hacking" every skeeter on the planet will be a knock-down, drag-out slog of decades. But it's a worthy fight...and it sure beats the tar out of poisoning the ecosystem with whatever hell's broth Dow is cooking these days.
As a programmer/tinkerer, my goals aren't even a tenth so audacious and world-changing. But that doesn't mean that I can't appreciate the scaling issues. And the wherewithal it will take to surmount them. All the best, my fellow hackers...you're going to need it. And then some.
Monday, June 29, 2015
Half-glassing it
Executive overview for folks who don't work in software: There's a stage in an application's life known as the "Beta." It has nothing to do with fish--not enough "T"s, for starters. Basically, it means that the developers--or at least their managers--think that the code is reliable and useful enough to let outsiders look at it. "Beta" used to come in two flavours:
- Limited Beta, wherein hand-picked outsiders were allowed access to the application. (Read: Friends/relatives/ex-coworkers were nagged into taking it out for a drive.)
- Open Beta, wherein any interested party had access to the app. with no guarantees that it wouldn't crash their computers and/or eat their pets. (Yay, Slashdotters!)
Anyhoo. I'm the developer for an app. scheduled to go into an invitation-only Beta phase within a few weeks. I'm stiffening my spine by re-reading Eric Ries's The Lean Startup for, like, the 2nd or 3rd time (I forget which). Because, by my usual standards, the code (and infrastructure) quite honestly suck. But by "lean" startup standards, that's absolutely OK. In fact, it's considered optimal.
The "lean" methodology takes its cue from the "lean" manufacturing processes that were all the rage a couple decades ago. And while my view of management-by-fad is not significantly different from my view of kings and queens of centuries past (or not so past) who kept pet astrologers, "lean" management at least admits that it's a calculated--yea, even disciplined--response to certain types of uncertainty.
According to "lean" doctrine, the holy grail of the startup is the "MVP," or "Minimum Viable Product." Interestingly, MVPs can originate from established companies (that haven't ossified to the point of knowing nothing beyond milking the existing product-line or desperately scrabbling for a hold on the trailer-hitch of the the latest-greatest bandwagon). The central idea is that you focus your efforts on the single thing that scratches an itch no one else has been able to reach...at least for your target demographic. Sure, not everyone itches in the same place--heck, some folks really only need a vigorous massage, when it comes right down to it. But the early adopters will give you the revenue and insights you need to go mainstream.
The problem is that MVPs are UGH-lee. Features that should be no-brainers are missing. And features are vastly outnumbered by bugs. Performance is slow...assuming the whole thing doesn't crash altogether. Error-logging is practically non-existent. Cut corners and hard-coded "special cases" are the norm rather than the exception. Customisation? That's done by directly editing the database. And automated build-processes? Puh-leeze--getting a new version out the door would make Rube Goldberg weep tears of blood.
That's excruciating to the software developers who have to hold their noses, grit their teeth, and shove the dewy-eyed Beta version out into the cold, cruel world. There's no post-release euphoria, because the pre-release sprint is merely the warm-up for the iron-keyboard marathon of whack-a-mole bug-fixing and the unholy offspring of Tetris and Calvinball that determines the next set of features.
Example: One of the enthusiastic early adopters wants this one extra feature that will put them eighteen months ahead of their closest competition. And--get this!--they're willing to pay for it. Woo-hoo! Problem is, their pet feature is so whacked-out that it will change the overall architecture of the app. Worse, absolutely no one else has asked for anything close to this. Ever. So...do you shut up and take their money? Or do you focus your already-overstretched resources on making the app. more appealing to the mainstream? That sort of thing. Fun times. And just another day in Beta-land. Ah, the sexy glamour of software development...
But subjecting my code (yet again) to the rigours of the Real World(TM) makes me realise that, at some point in its life, code exists in a quantum state. There's the optimism (i.e. glass half full) that it will be "good enough" to add value for its early adopters...and have enough time/resources to evolve for a wider ecosystem. There's also the cringing awareness (i.e. glass half empty) of just how much duct tape tenuously holds together too few features...that all have your name (and reputation) on them.
Typically, that "quantum" state isn't long-lived. Which is a mercy, because it's rather uncomfortable. If I were Schroedinger's cat, I'd be gnawing my way out of the box by now.
Wednesday, June 10, 2015
A (wrong) turn of phrase
Today, I was about an hour ahead of schedule for a meeting. I normally can't make enough excuses to visit Moncton's Cafe C'est la Vie, so it's not like I was constantly checking my watch. I'd packed a tablet loaded with books as well as a treeware one, but I ended up mostly ignoring both in favour of the raver conversation going down a few tables away.
The tribulations (or "pain-points," in Marketingese) of being a rave-going millennial actually spawned a business idea. Mind you, it's not the kind that's going to make me dump everything to start hacking out code and recruiting co-founders who grok sales and CSS and mobile and the problem domain, respectively. But it's one that's at least worth exercising my Google-fu. Just not tonight--it's already late and, besides, the current client always takes priority.
But the experience made me question the wisdom of the English cliché, "the idea came to me." Maybe I'm the anomaly here, but generally I have to go to the idea. I just know that this is not an insight I would have had chez fivechimera. 'Nuff said.
The tribulations (or "pain-points," in Marketingese) of being a rave-going millennial actually spawned a business idea. Mind you, it's not the kind that's going to make me dump everything to start hacking out code and recruiting co-founders who grok sales and CSS and mobile and the problem domain, respectively. But it's one that's at least worth exercising my Google-fu. Just not tonight--it's already late and, besides, the current client always takes priority.
But the experience made me question the wisdom of the English cliché, "the idea came to me." Maybe I'm the anomaly here, but generally I have to go to the idea. I just know that this is not an insight I would have had chez fivechimera. 'Nuff said.
Monday, May 18, 2015
Scratching at scale, Part I
All software starts with an itch. Sometimes that itch is practical: "How can I easily manipulate business data as rows and columns?" Enter Lotus 1-2-3. Other times, not so much: "How can I meet and hang out with other Harvard students without actually having to leave my dorm room?" Enter Facebook.
But when folks approach me to develop software for them, the practical-vs.-frivolous distinction isn't the one that matters. Instead, the real question boils down to whether or not the software should be built for scalability.
In I/T terms, "scale" largely translates into, "How many users does this software have to support?" The answer to that question drives decisions such as hardware and even language/platform. But it also drives decisions at a more fundamental level than that--and not only on the I/T side of the team.
Software usage is kind of like a democracy--people can vote. Sometimes votes come in the form of dollars. But even with "free" software, people vote with their feet. That also applies to software that's mandatory--for instance, time-sheet/expense apps. at the office. And, just like political democracies, more users translate into various "tribes" pushing and pulling at how the software is ultimately used. True, its use can be nominally mandatory. But if it's poorly designed (or incomplete), it will be avoided, circumvented, hacked, abused, or misused whenever possible. Anyone looking to it to provide an accurate insight into what's going on is screwed. (Because we all know that false metrics are worse than no metrics, riiiiiiight???)
For someone with an itch looking to build a new (or just improved) back-scratcher, it's critical to understand that the features-to-user ratio does not stay constant as the number of users rises. There will be inflection-points along the way, and they invariably tick upward. To wit:
- A revolutionary back-scratcher, The ItchEraser3000, is invented.
- A few dozen are sent to the reigning "lifestyle" taste-makers. A handful of reviews are written, largely glowing.
- A surge of orders triggers a mad scramble to ship a few thousand out the door before the buzz dies down.
- Uh-oh, several are returned for the money-back guarantee because the ItchEraser3000 was designed and tested exclusively by right-handers.
- Thanks to small manufacturing runs and the miracle of computer-aided design, a version optimised for the left-handed is made available.
- Targeted marketing (and cultivating some of the dissatisfied clientele) lands a mention on the SouthpawsUnited Facebook group.
- A scramble to find a manufacturer capable of larger batches ensues as the virtuous cycle of word-of-mouth grows sales, but life is generally good.
- The cash-flow hit from the January sales slump is exacerbated as users with upper body mobility issues return the IE3000s purchased for them by family members.
- The hit-and-miss process of developing an ergonomic version of both the left- and right-handed ItchEraser is costly, with no word-of-mouth payoff except kudos from one or two advocacy groups.
- The ergonomic models have slow and fitful sales, resulting in a net loss.
A veteran software developer will recognise this pattern, and help her/his clients understand the implications--at least from the standpoints of system complexity and maintainability--as each new demographic appears.
Ultimately, though, it's up to the client to understand (even if they might not actually anticipate) the afore-mentioned "inflection points" in the adoption curve, and the risks/rewards of pursuing them. Unless the software being developed is extremely narrow in focus (and its user-base is small), dealing with those inflections requires a bit of mental gymnastics. That's another blog post for another time. Barring unforeseen complications, that time will be Wednesday.
Thursday, April 30, 2015
The "lean" economy
There's a school of code-slinging known as "agile programming," which is an offshoot of the "lean manufacturing" pioneered by Toyota in the aftermath of WWII. One of its most basic tenets is the commandment to "fail fast." In software, that means getting the most usable features--even in the roughest form--out in front of real users as quickly as humanly possible. (The terminology is MVP, or "Minimum Viable Product," btw.)
But the most useful part of the "fail fast" mantra is the three-fold assumption that:
- Failure, at some level, is inevitable.
- Small, inexpensive failures should be insurance against large, catastrophic ones.
- You should be willing and ready to "pivot" your business model and/or product at the proverbial drop of a hat.
Coming, as I do, from that operational mindset, I have a difficult time stretching my think-muscles around the willingness of big business and government to compound failure by postponing it. That difficulty is exacerbated by both the money-printing-presses of central banks and the hoarding behaviour of large corporations. Don Pittis' editorial on "zombie economies" really brought that home yesterday.
Pittis is a little vague on the point of zombie companies in particular. But what I suspect he refers to, at least in part, is companies buying back their own stock, financed with artificially cheap money. Fortune in February noted that large buybacks don't tend to be well-executed by companies for a couple of reasons:
- They're buying stock at relatively high prices, and
- They're spending that money in lieu of "riskier" R&D.
This is about as antithetical to "agile"/"lean" as it gets. Being too afraid to take calculated short-term risks sets up more consequential dangers in the long-term.
Tragically, what the post-2008 coddling of the investor class has amply demonstrated is that failure does not have consequences--at least not for those who screwed up. Anyone who subscribes to the notion of the "rational actor" of neoclassical economics (which I don't) has to admit that its vaunted "rationality" breaks down in the absence of market discipline.
Look. I have a mortgage. But my interest rate ticking up at renewal time is preferable to the damage that an artificially-induced recession will do to my income. Six years (and counting) into the post-bailout malaise, it's amply clear that artificially low interest rates are an anaesthetic rather than a cure. Job growth is anaemic at best; wages have remained stagnant. Worse, any incentive to save against the next recession has been severely dented by those same low interest rates--not to mention what banks siphon off as fees.
But, hey, at least no one's calling it a Depression, right?
Any sane startup or product manager would have pivoted a looooooong time ago. Yes, I do realise that the world's largest economies (and companies) don't turn on the proverbial dime. But, looking at the experience of Iceland post-2008, I have to wonder whether letting banks fail and using the bailout money to write off mortgages wouldn't have been the better option. Short-term pain vs. long-term gain. That sort of thing.
Plus, you can't deny the soul-satisfying appeal of Iceland tossing its greedy, irresponsible banksters in jail. Alas, the only way North America could scrape up enough political will for that would be to turn it into a particularly trashy reality show. Snap to it, Hollywood!
Thursday, April 16, 2015
Progress by paradox
While waiting for a process to finish, I decided to take a crack at wiring what, chez fivechimera, is snarkily referred to as "The Mark II." It's the second generation of the cat-on-the-counter alarm. The passive infrared sensor on the "Mark I" is doing its job just a little too well--Dennis and I are triggering it merely by passing too close.
Thus, in Version 2.0, the passive infrared sensor (which detects motion) is being supplemented by a distance-sensor which works not unlike a bat's echo-location "sonar." Granted, the distance-sensor's accuracy on a moving cat is less than stellar (and I know this because Office Cat #1 decided to trundle around on my lap while I was testing it...bwoohoohoohahahahahaha.) But after a few rounds of field-testing and tweaking, we should be able to find a workable number-range.
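For the hardware-curious, the logic itself is the easy half--it's the wiring that's the headache. A minimal, Arduino-style sketch of the Mark II idea (with hypothetical pin numbers, an HC-SR04-style trigger/echo sensor, and a distance window that field-testing will doubtless overturn) might run along these lines:

```cpp
// Hypothetical Mark II sketch: only sound the buzzer when the PIR sensor sees
// motion AND the ultrasonic sensor reports something inside "cat on the
// counter" range, to cut down on false alarms from passing humans.
const int PIR_PIN    = 0;
const int TRIG_PIN   = 1;
const int ECHO_PIN   = 2;
const int BUZZER_PIN = 3;

const long MIN_CAT_CM = 5;    // closer than this is probably sensor noise
const long MAX_CAT_CM = 40;   // farther than this is probably a passing human

long readDistanceCm() {
  // Standard HC-SR04 dance: 10-microsecond pulse on trigger, time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long durationUs = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
  if (durationUs == 0) return -1;                    // no echo, no reading
  return durationUs / 58;                            // rough microseconds-to-cm
}

void setup() {
  pinMode(PIR_PIN, INPUT);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  bool motion = (digitalRead(PIR_PIN) == HIGH);
  long cm = readDistanceCm();

  bool catOnCounter = motion && cm >= MIN_CAT_CM && cm <= MAX_CAT_CM;
  digitalWrite(BUZZER_PIN, catOnCounter ? HIGH : LOW);

  delay(200);
}
```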
The form-factors of the hardware are sort of conspiring to make the wiring pretty tight. Adafruit's Trinket (assuming it's using male pins) has been hailed as "breadboard friendly." But in practice, the mini-USB port barely catches the edge of a full- or half-size breadboard, with the result that the pins nearest the port don't completely seat.
A mini-breadboard has less of a border on it, so that's not a problem. But that form-factor requires one to be really creative with the wiring. Particularly when the ultrasonic distance sensor is as long as the breadboard itself and realistically has only one place it can be plugged in without chewing up too much real estate. And that's before factoring in that the new PIR sensor is downright breadboard-hostile, and will have to be mounted somewhere else.
But we're told that limitations encourage creativity, right?
Yet, I would probably be a little more paranoid about my wiring job if volume pricing (and shipping rates) didn't result in me owning more than one Trinket, infrared sensor, buzzer, etc. By contrast, when I wired up my first project, I cringed at how much I had to wrestle the little push-buttons into the breadboard. By now, I have zero compunction about coercing cooperation from them--with pliers, if necessary. Owning twenty instead of two has a lot to do with that.
Because we're also told that lowering the cost/consequences of failure encourages creativity.
And therein lies the paradox.
On one side, limitations (a form of scarcity) are supposed to make us do more with less--perhaps even achieve what was previously considered impossible. But on the other, the luxury of failing our way to success requires a certain safety margin of abundance.
With that paradox in mind, it's not difficult to empathise with anyone scrambling to bring a new product online before the money runs out. Or a company that has to bring a new offering to the market while still supporting the older offering that's paying the bills. Basically, those folks have to live in two almost diametrically opposed mindsets at once. Given how badly most of us deal with cognitive dissonance, it's pretty amazing when anyone pulls it off.
Monday, April 6, 2015
A belated "Happy Birthday"
My bad. I knew last week that Microsoft's 40th birthday was coming up, and then completely spaced it out over the long weekend. My brush with the company actually came fairly late in a Generation Xer's development. My trajectory was TRS-80 to Apple II, followed by a three-year hiatus, and then MS-DOS alternating with early Macintosh for a bit, finally settling on Windows 3.1 and so on up.
These days, the Windows development stations don't have much to do except wait for updates--and for the day when one of them ends up doing the duty of both and the other is recycled as a home server.
But even as a Linux user, I realise that monoculture, even in operating systems, is infeasible. Particularly in my trade, which is expected to support Mac, Windows, Blackberry, Android, iOS, and various flavours of UNIX. Moreover, as both the Irish Potato Famine and the rash of script-kiddie-fueled havoc of the 2000s taught us, monoculture can be downright disastrous.
Still, it would be distinctly un-Hufflepuff of me not to acknowledge the audacious moon-shot that Bill Gates, Paul Allen, et al. (nearly) pulled off: a computer on every desk, running DOS/Windows (depending on the decade).
It wasn't Borg-like single-mindedness, nor even plain luck. Painting with a broad brush, some of the key factors in Microsoft's ability to make three billionaires and a whopping 12K millionaires from its 1986 IPO are, IMLTHO:
The Economics of Complementary Products: Meaning that if the price of peanut butter drops, you should be able to sell more jelly as well as more peanut butter (even if the price of jelly remains stable or even rises a little). Personal computer prices were plummeting even as their capabilities followed the exponential trajectory of Moore's Law. That made it easier to justify spending more money on software. The trade-off would become more opaque as new PCs and laptops began shipping with pre-installed copies of DOS/Windows. And then the greater volumes spurred more efficiencies (some good, some appalling) in the electronics industry, and...well, here we are...
Apple in a Tail-spin: Jobs had been pushed out by Sculley and Woz had bailed as Apple's first act was fizzling out. (If my Gentle Reader's memory of Apple before iPods and Macbooks is a little fuzzy, think Robert Downey Jr. between Ally McBeal and Iron Man and you're pretty close to the mark.)
Stepping-stone Partnerships: After Microsoft bought a CP/M-workalike operating system (86-DOS) and adapted it (or in programmer parlance, "ported" it) for the IBM PC, it retained the rights to market its own version alongside IBM's PC-DOS. As PC clones rushed into the market, the go-to operating system (MS-DOS) was a no-brainer. Microsoft also collaborated with IBM on the OS/2 operating system, but DOS (and other business ventures) eventually proved too lucrative, and Microsoft dissolved the OS/2 partnership.
A Bigger Bogeyman: Strangely, there was once a time when Microsoft was an underdog. It's generally accepted that, as the PC boom was mushrooming, IBM made superior hardware. Alas, at a higher price-tag and with absolutely zero intention of following anyone's standards but their own. PC-DOS and OS/2 work notwithstanding, Microsoft found itself part of an intrepid little band that also included Intel and Lotus in the battle for how RAM is used by programs. (And people cheered them on!)
"The Second Mouse Gets the Cheese": Because first-mover advantage is overrated. Particularly in new markets where consumers aren't sure what they're supposed to want yet. I think that fairly characterises the mid-1980s when people wanted a computer because all the cool kids had one. (As, later, they just had to have their AOL, and Facebook, and iGadgets, and...)
"Knowing Where the Bodies are Buried": Microsoft's inside knowledge of their operating system gave an insurmountable strategic advantage to developers of the Office suite of products. In other words, it's developers could take short-cuts through the operating system that companies like Lotus (of 1-2-3 fame) or Corel (WordPerfect) couldn't even know without first reverse-engineering.
Lack of Complacency: At least in the 1990s, Microsoft was highly jealous of its dominance, and targeted threats--real and imagined--with extreme prejudice. Which was reflected in Gates's now-(in)famous "Pearl Harbor Day" reaction to an entire realm of software (a.k.a. the internet) that had materialised outside the fiefdom of the desktop operating system. As many of us likely remember, shenanigans ensued. Any web developer who still has to support Internet Explorer 8 could probably make the case that we're still paying for those shenanigans today.
Embracing Hardware Diversity: It's basically the network effect, in that the value of a network is a function of the square of its number of members. Thus, the more things you can do with a computer, the more valuable it is. That encompasses hardware as well as apps. And goodness knows I've wasted enough (sometimes fruitless) time with Linux and wireless to appreciate when "it just works." Yes, there's a Microsoft Mouse, and the XBox, but overall, Microsoft has not let itself become too distracted by new hardware form-factors.
Alas, Microsoft's relationship with third-party developers, historically, has enjoyed levels of dependency and dysfunction normally not found outside Tennessee Williams' plays. Obviously, there has been a lot--and I mean a LOT--of (ahem!) "unsportsmanlike conduct" perpetrated from Redmond, WA. Let's not gloss over that, not by any means. Nope.
Worse, I'm not even referring to Steve Ballmer's lizard-brained "cancer" characterisation of open-source software. Microsoft took a PR black eye after it copied a feature from BlueJ, then initiated a patent filing--even after management was aware that the work had been pilfered. In that instance, MS (wisely) backed off. But Windows developers are/were (or should be / have been) all-too-aware of the truth in the ancient Greek blessing "May the Gods stay far from your destiny." To wit: Make a Windows app. or plug-in that's too successful, and Redmond might just bundle a knock-off with the next version of Windows (or the applicable application). Which puts said developer out of business--after taking on all the risk and sweat of prototyping and marketing.
But, hey, at least that's a real product, and not mere weaponised marketing--a.k.a. vapourware. I'd hate to know how many companies or proto-companies tanked simply because someone at Microsoft whispered in the ear of Slashdot or Wired or PCWorld or what-have-you.
Yet.
And yet...
When it's all said and done, I'm not entirely convinced that my Gentle Reader and I would know each other were it not for the aforesaid moon-shot. Where I used to work, we had the acronym "BHAG" (pronounced "Bee-hag") for Big, Hairy, Audacious Goals. And there is something--actually multiple and varied somethings--to be said for a company that can score a BHAG. Despite the lean, uncertain times. More importantly, despite the smug, Dom-and-Beluga fat times.
Microsoft, for all its sins (both myriad and legion), literally changed the world. And I think that needs to be acknowledged...even by open-source partisans such as your faithful blogger.
Wednesday, March 18, 2015
Vicious cycles, Part II
Back in 2009, when "Views from the Bridge" were but a wee lass, I poked fun at a weakness I share in common (solidarity?) with other crafters--namely, the infamous Designated Tote Bag Syndrome(TM). As it turns out, I was being too narrow in my satire. Because, apparently, it's not just a crafter thing.
Alas, the Moncton area's options for dealers in off-the-shelf electronic components are pretty thin right now. Also, the loonie-greenback exchange rate being what it is(n't), it's becoming more expensive to mail order certain kinds of stuff. And, as always, the wanna-be mad inventor can expect to pay through the nose on shipping. That confluence of costs makes bulk-purchasing more attractive. Even for a tight-wad such as your faithful blogger.
So, pitting my frugal Upper Midwestern upbringing against itself (i.e. saving money by dint of spending money), I ended up with eight programmable micro-controllers (a.k.a. "Trinkets"), because at that quantity they were under 10 loonies a pop (ignoring tax and shipping, as our consumer-geared brains are wont to do). Those micro-controllers, however, aren't much use to me without a mechanism to plug them into something--in my case, a breadboard.
But the good folks at CE3 pointed me to BJW Electronics, who were happy to solder headers onto them. (For the record, I'm not deep enough into this hobby to justify a soldering iron plus accessories. That, and my Mom still has the toolbox I tried soldering for her in middle school Industrial Arts, and it's frankly embarrassing--like Nidavellir collectively sneezed on cheap sheet metal. But that's just how Moms roll, so...#whaddyadoamirite?)
That all--again minus HST and whatever shipping costs are associated with me picking them up myself--added about five loons a pop. But it's still better than the $30-40 CDN you'd pay for a full-blown Arduino, yes?
I already have two--okay, probably three--projects earmarked specifically for the Trinkets, and I assured myself that the other six (or five) would come in handy. Sometime.
But then I thought of Halloween.
And how I can certainly figure out how to make red, white, and yellow LEDs imitate candles in lieu of waiting for the Grande Digue winds to (inevitably) blow out the usual tea lights inside the Jack-o-lanterns.
And there's no reason (apart, of course, from rain) why the LEDs couldn't be triggered by a passive infrared sensor.
And as long as I have to pay postage for a PIR sensor anyway, I might as well get a handful.
And at that point, it might make sense to have another PIR sensor trigger a servo-motor to loose a gravity-propelled "ghost" on a guy-wire strung between the pine trees and our front door.
Aaaaaaannnddd....
My Gentle Reader sees where this is heading, right?
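Where it's heading, concretely, is something like this: a minimal Arduino-style sketch for one Trinket driving a flickering "candle" LED off a PIR sensor. (The pin numbers and timings below are placeholder assumptions for illustration--not a field-tested build, and certainly not rain-proofed.)

const int PIR_PIN = 2;   // the PIR sensor's digital output
const int LED_PIN = 0;   // a PWM-capable pin driving the "candle" LED

void setup() {
  pinMode(PIR_PIN, INPUT);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  if (digitalRead(PIR_PIN) == HIGH) {         // motion: trick-or-treater, raccoon, wind-blown leaf...
    for (int i = 0; i < 200; i++) {           // flicker for roughly ten seconds
      analogWrite(LED_PIN, random(40, 256));  // uneven brightness reads as candle-ish
      delay(random(20, 80));                  // uneven timing helps, too
    }
    analogWrite(LED_PIN, 0);                  // back to dark until the next victim
  }
}

Repeat the same trick with a second PIR sensor and a small servo, and the guy-wire ghost is mostly a matter of string and nerve.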
The problem is, now I definitely don't have enough Trinkets to make this happen. And the irony of it all is that we average two sets of trick-or-treaters per annum.
But that's just the problem. With some people, the means implies the motive...and then the motive implies the means...which then again implies the motive...and so on.
Culturally, I suppose, it's a good problem to have. Except for when the credit card bill arrives.
Wednesday, January 14, 2015
Progress is child's play
One of the problems with dead-tree magazines--apart from the obvious dearly departed tree--is that they carry a bit of the "sunk cost fallacy" baggage. To me, this seems particularly true of folks who care enough about a particular publication to subscribe to it. Thus, I occasionally am the beneficiary of slightly out-of-date magazines like National Geographic, Popular Science, etc. (Yeah, some of my tribe is old-school.)
Last December's Discover carried an article about Science Kits aimed at the 20th century American boy...with a nod to Gilbert's (pink!) "Lab Technician Set for Girls." Cringe-worthy cultural norms aside, one paragraph in the description of the Gilbert company's bread-and-butter product, the Erector Set, caught my attention. (Caveat: Wikipedia's photos aren't at all evocative, so here are some more.) I suppose you could call the Erector Set the precursor to Legos...maybe in the same way that wolves are the ancestors of teacup chihuahuas.
"With pulleys, gears, metal strips and beams (both straight and curved, depending upon the model), screws to fit them all together, and even a DC motor in bigger sets, Erector soon became the gift that mechanically inclined boys wanted for Christmas. Many parents were happy to indulge those wishes during a time when engineers generally earned more than doctors [emphasis mine]."
[Double-take]
Whoa...waitaminnit...d'y'mean to tell me there was once a Golden Age when engineers actually had that kind of clout???
But before I started cobbling together a time machine from my stash of Legos and Buckyballs, I had a change of heart.
It wasn't just the fact that the glass ceiling for engineering of a century ago, like most everything, was bulletproof. Medicine (in which category I also include nutritional science) was still shedding its aura of quackery. And it certainly didn't standardise, much less scale to the level of an HMO. A few cases in point:
- 2014's Ebola deaths numbered under 8,500. The influenza pandemic of 1918-1919 killed at least 20 million people worldwide--and by some estimates more than twice that.
- Yeah, there was no such thing as a flu shot. The only time-tested vaccine available was for smallpox, and the production/scaling issues were still being worked out. You also took your chances with tetanus, polio, measles (regular and German), diphtheria, and whooping cough. Those fighting yellow fever and malaria put their energies into mosquito control. And you could contract tuberculosis from drinking the milk from an infected cow.
- With absolutely no support from the U.S. Supreme Court, reforming legislators* in the 1900s and 1910s were fighting an uphill battle against poisonous food additives and false therapeutic claims in "patent medicines" laced with narcotics (or worse).
- Contraception--in the form of condoms--only became legal in the U.S. in 1918 after, ahem, "rigorous field testing" in WWI. Women were left to control their own fertility via (cough!) "hygiene" (cough!) products of dubious efficacy. Canada's 1892 morals-guarding anti-contraception law was repeatedly challenged (and honoured more in the breach than the observance) until contraception was finally decriminalised in the late 1960s.
- Having shaken off the pseudo-science of astrology and magnetism for healing, modern scientific medicine brought radiation out of the lab and into the clinic**. X-rays could spare a patient the dangers of exploratory surgery...albeit while endangering both patients and the people administering unregulated doses over staggeringly long exposures.
- School lunches (which guaranteed a child at least one nutritious meal per school day) were experimental programs in Philadelphia and Boston***.
- And speaking of Boston, Massachusetts implemented the first water quality standards and municipal water treatment in the nation during the late 1800s, but not all states were on board until the 1970s.
Thus ended my pouting over the engineer-doctor wage differential in these latter decades. Clearly, the health industry is serving us far better than it did a century ago. I also like to think the reversal means that quality of life has become more valuable than mere things.
- - - - -
P.S.: Shout-out to Dennis, who reminded me to mention the advances in vaccines. Because taking them for granted is bad, as we're once again seeing.
- - - - -
* Canada was 30+ years ahead of the United States (1874 vs. 1906) in regulating food and drug purity, at least at the federal level. This was largely the response to the outcry against the glut of adulterated booze on the market. Which should tell my Gentle Reader everything s/he needs to know about the priorities of the 19th century male. (Women, who might have had something to say about preservatives and fillers and shorted weights at the market, of course didn't have a vote.)
** X-ray applications became even more mainstream in the 1920s, when the fluoroscope was used to measure shoe fit and remove unwanted body hair.
*** Pioneered by Ellen H. Swallow Richards, the first female graduate student at MIT, the first female chemist in the U.S., champion of water quality, nutritionist, and basically a Force of Nature. (Suck it, Gilbert Lab Technician Kit for Girls!)
Monday, December 1, 2014
Two styles of product development
Whew. That was a close call. For a few minutes there, I thought that the house had also eaten James Webb Young's A Technique for Producing Ideas. (Mind you, I can't find the Penguin History of Canada when I need it, so the house still has some 'splainin' to do. But that's another story.)
I have a librarian friend who--unsurprisingly--organises her home bookshelves according to Library of Congress numbering. I'm not so professional m'self, but even so, my History shelves are parsed according to a system...albeit one which likely makes no sense to anyone else. And the delineation between "literature" vs. mere "fiction" admittedly tends to factor in book size to an inordinate degree.
But given my usual categorisation instincts, the fact that I didn't immediately think to look for Young's book in the "software development" neighbourhood is disgraceful, truth be told. Particularly as that's also where Strunk & White and a dot-com-vintage Chicago Manual of Style live. (Anyone who thinks that being a software developer starts and ends at the bytes is--in Web 2.0 parlance--"doin it rong." So sayeth my bookshelf: Your argument is invalid.)
A Technique for Producing Ideas is a slim slip of a book--almost a pamphlet on steroids, really. It dates from the advertising world of the 1940s--notably before "the tee-vee" became the most valued fixture in America's living room. But even a grab-your-belt-and-fly-by-the-seat-of-your-pants autodidact like Don Draper would have known it chapter-and-verse. And it's no less relevant today, when all we First World pixel-pushing proles (allegedly) need to do to hose the backwash of globalisation off our blue suede shoes is "innovate." (This is where I'd love to link to Rory Blyth's brilliant, scathing "Innovidiots" blog-post, but it looks like it's offline indefinitely.)
Absent Mr. Blyth's take on the subject, I think our biggest problem with the term "innovation" is its intimidating suggestion of the blank page. And I don't think I'm making a straw-man argument when I say that, popularly, innovation is too often conflated with creating something ex nihilo. Intellectually, you can look at the portfolio of the most valuable consumer products company on the planet (Apple): Graphical user interfaces, MP3 players, smartphones, and tablet computers--and know that Woz/Jobs didn't invent a single one of them.
That insight doesn't necessarily help when you're staring into a looming abyss of enforced downtime--yay, holidays. It helps even less to remember that Sir Tim Berners-Lee had the first web browser and HTTP server talking to each other by Christmas Day, 1990. No pressure there... [grumble]
So to bring things down to the scale of a manageable metaphor, you mainly just need to decide whether you're ultimately the waffle-iron or the crock pot when it comes to making something.
Waffles have the advantage of being very specific--anyone who's been by the grocery store freezer case should have the basic idea down. But the parameters, to a certain extent, are relatively fixed: Starch--typically flour--for body, eggs for structure, some sort of leavening (typically baking soda/powder or yeast) for loft, and milk to make the batter pourable. Too much of one thing and you could have a weird-looking hockey-puck or (literally) a hot mess. Moreover, modern electric/electronic waffle irons typically impose limits on temperature.
Within those basic parameters, however, you can make some amazing waffles. (In my world, read "Dennis" for "you.") Making a "sponge" of yeast-leavened batter the night before, and only adding the eggs in the morning, for instance, makes for a revelation in texture. Likewise, eggs can be split into yolks for the main batter, while the whites are frothed and gently folded in afterwards. A touch of vanilla or almond extract? Yes, please. Topped with lingonberry syrup (because you live close enough to Newfoundland/Labrador that it's a staple in Maritime grocery stores)? Bring it.
Waffles are incremental innovation in a nutshell. Evolution, y'understand.
In contrast, there's the crock pot. True, milk and/or eggs probably won't be staples of most recipes. But apart from those, you have a lot of latitude...assuming you respect the laws of Physics. A crock pot will happily simmer organic vegetarian vegetable soup all day. A crock pot will just as happily caramelise cocktail weenies and bottled BBQ sauce into artery-clogging, potentially carcinogenic ambrosia. A crock pot doesn't judge.
In tonight's metaphor, that latitude is what pushes the crock-pot toward the "revolution" end of the invention spectrum.
I'm not particularly partial to either--in fact, I'm delighted when an idea that I consider commoditised is successfully re-invented/re-imagined. LCD monitors, LED light bulbs, thermostats, etc.
But whether you ultimately choose to make waffles or some slow-cooked goodness, the end-goal is the same. Sure, maybe the first few attempts you'll end up feeding to the dog or what-have-you. But ultimately, you have to muster the confidence to serve it to company. Because just as there is no art without an audience, there is no invention without an end-user.
Wednesday, November 26, 2014
The flip-side of an old engineering adage
The description of software development as an "engineering" discipline is, to me, one of those "for lack of a better term" bits of taxonomy. Sure, there are marked similarities. In an increasingly networked world, it can truly be said that lives are riding on things working as designed--even when those "things" are merely electrical impulses shuttled across networks or bits stored on silicon or magnetic platters.
There's another area where software engineers and all other flavours of engineers certainly do overlap. That's in the common adage, "'Better' is the enemy of 'done.'" In management, it's the mantra of those who have to keep an eye on cash-flow. Below management level, it's the mantra of people who have this filthy habit of wanting to spend time with their families and/or significant others.
Don't get me wrong: I'm totally about 40 hour workweeks and keeping the company solvent. I consider anything less an #epicfail on the part of management.
Yet, what rarely (if ever) is addressed is that the adage has an equal-and-opposite truth:
"Done" is the enemy of "better."
If you didn't instinctively grok the essence of that, I can pretty much guarantee that you will the first time you have to get Version 2.0 out the door. All those corners you had to cut? They're still as sharp as they ever were. All those values you hard-coded as 1:1 relationships because you didn't have time to populate association tables? Consider them diamond-hard-coded now. Yeah--have fun dynamiting those out.
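For anyone who hasn't had that particular pleasure, here's a contrived sketch of the before-and-after (the names are invented for illustration only). Version 1.0 bakes the one-to-one assumption right into the record; Version 2.0 has to blast it out into a proper association table before anything many-to-many can happen:

#include <string>
#include <vector>

// Version 1.0: "a product belongs to exactly one category" is hard-coded
// into the row itself. Cheap to ship; diamond-hard to change later.
struct ProductV1 {
    int         id;
    std::string name;
    int         category_id;   // exactly one, forever (or so we told ourselves)
};

// Version 2.0: the relationship moves into its own association table,
// so a product can belong to any number of categories (and vice versa).
struct Product         { int id; std::string name; };
struct Category        { int id; std::string name; };
struct ProductCategory {       // the association ("join") table
    int product_id;
    int category_id;
};

int main() {
    std::vector<ProductCategory> links = { {1, 10}, {1, 20}, {2, 10} };  // product 1 now lives in two categories
    return links.empty() ? 1 : 0;  // token use so the compiler doesn't grumble
}

Trivial on a whiteboard; rather less trivial once a year of production data has accreted around the Version 1.0 shape.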
Now, I would never say that those first-iteration shortcuts were ill-judged. After all, this is Version 1.0 we're talking about. One-point-oh is a unique and invariably rude & mercurial beastie. Version 2.0 is our attempt to domesticate it. Flea-dip. De-worming. The dreaded trip to the vet's office. If we play our cards right, it won't eat any (more) couch cushions. If we're very lucky, it won't lick its nethers in the middle of the living room...at least not while company's over.
Problem is--and oh, Friends and Brethren, I confess that I also have the stain of this sin upon my soul--too many times we don't take Version 2.0 as seriously as we do 1.0. First generation products are castles in the air that have been successfully brought to earth. That's a massive and commendable achievement. But thinking of Version 2.0 purely in terms of "features we didn't have time for in Version 1.0" is not a forgivable sin for anyone who aspires to the title of "Software Engineer." After all, no sane engineer would dream of adding turrets and towers to a castle built on sand.
For programmers, the work authorisation for Version 2.0 is an opportunity to pay down some (technical) debt, not just wear the silver numbers off a brand-new debit card. And in actuality, I'm preaching to the choir for the vast majority of programmers. It's the folks who commission programmers to make new stuff for them that I'm hoping to convince here.
Clients: Your programmers saved you a wad of cash up-front when you weren't sure whether that wild-haired idea you had on the StairMaster would actually pan out. But its weekend of garage-tinkering in its boxer shorts is done; let's get this thing showered, shaved, and dressed for real work. Don't be surprised when that takes extra money and time. Whether you were aware of it or not, you're the co-signer on the afore-mentioned 1.0 debt.
That probably comes off as sounding brutal. But I can assure you that it's a mother's kiss in comparison to the sound of a potential customer clicking on a competitor's product instead.
Wednesday, November 19, 2014
Working "unplugged"
So there's been a bunch of chatter in recent years about reducing distraction at work...at least when we're not being all collaborative-y, getting loopy on dry-erase-marker fumes and all that knowledge-pollen we're sharing while singing "Kumbaya" in some corporate war-room.
Turning off your cell phone, powering down the IM client, signing out of social media, even checking out of the digital Hotel California otherwise known as email--that's how I usually understand "unplugged" working.
But what would happen if we also pulled the plug on our workstations? (Assuming that they can run off batteries, of course--regular PCs don't take kindly to that sort of thing.) Human value-systems shift drastically when something previously taken for granted becomes scarce. I can't imagine that electrical current is any different.
I, working on this 2009-vintage Dell, would be lucky to see half the battery life of, say, a new-ish MacBook Air. But...would that make a difference in productivity? That's the interesting question.
Maybe, when we know that we need to go heads-down on some chunk of work, we should think about turning that battery indicator into an hourglass. (And, yes, I'm thinking about that scene from The Wizard of Oz.) Scarcity clarifies...and while not the mother of invention, it is frequently the midwife.
Tuesday, November 11, 2014
A nerdy Remembrance
I decided last evening to break the Monday-Wednesday-Frivolous-Friday pattern of this blog to make a tech-related post relevant to Remembrance/Veteran's Day. Arguably, drone warfare is the ne plus ultra (and probably the reductio ad absurdum besides) of the earliest military apps--namely, target acquisition. But having already covered that plus some origins of the wireless communication that also make drone strikes possible, I was casting about for a fresh twist on the intersection of military and technological history.
WWII buff husband (and kindred geek soul) to the rescue!
I want to make it absolutely, Waterford-crystal-clear that my intent tonight isn't to glorify military applications of technology. But when Dennis mentioned something called the Norden bomb-sight, I was intrigued. Because, after all, the whole point of the contraption was ultimately to drop fewer bombs overall by having them land on people who (arguably) "deserved" them...as opposed to collateral civilian damage. (Which, in turn, requires fewer missions in which "the good guys" will fly into harm's way.) For that to have a reasonable chance of happening, altitude, airspeed, the vector of the aircraft (relative to the target), the speed and vector of the wind (relative to the aircraft) all have to be taken into account. (Remember that the next time you're assigned quadratic equations by the dozen in Algebra. Also, please be grateful that you don't have to solve them while being shot at.)
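To give a sense of the arithmetic (and only the arithmetic--this toy version ignores air resistance, which the real Norden most certainly could not), here's the vacuum-ballistics sketch of the problem in a few lines of C++. The numbers are made-up placeholders, not anything out of the historical record:

#include <cmath>
#include <iostream>

int main() {
    const double g          = 9.81;    // m/s^2
    const double altitude   = 6100.0;  // metres (roughly 20,000 feet)
    const double airspeed   = 90.0;    // m/s, along the bomb run
    const double wind_along = -15.0;   // m/s, headwind component (negative = against you)

    const double drop_time      = std::sqrt(2.0 * altitude / g);  // seconds of free fall
    const double ground_speed   = airspeed + wind_along;          // m/s over the ground
    const double forward_travel = ground_speed * drop_time;       // metres the bomb carries forward

    std::cout << "Release roughly " << forward_travel / 1000.0
              << " km before the target (in a frictionless fantasy world).\n";
    return 0;
}

Now imagine solving that continuously, in real time, with drag and drift corrections thrown in--and no electronics.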
What Dennis described over dinner frankly sounded like what you'd see nine months after a gyroscope, a telescope, and a slide-rule all woke up during a hazy weekend in Vegas. That's not too far off in some ways, though later incarnations of the device also plugged into the plane's autopilot and even controlled the bombs' release because its precision was thrown off by human reaction-times.
Not surprisingly, this was top-secret stuff, at least until toward the end of The Good War. Norden-made (and rival Sperry-made) bomb-sights cooled their gears in safes between flights, and were supposed to be destroyed in the event of impending capture--even at the cost of American lives. British allies offered to trade the Crown Jewels--meaning sonar technology, rather than the shiny gew-gaws currently on display in the Tower of London--for its use.
Ultimately, however, it was an evolutionary dead-end in technology. It was sold to the Nazis by spies, but never used by a Luftwaffe that preferred dive-bombing. American bombers eventually adopted the carpet-bombing tactics of their RAF counterparts. So why care? (Well, besides the fact that I'm kind of a sucker for analog computing...though I have yet to learn how to use a slide-rule. Bad me.) Alas, it's also a grand textbook-quality example of a technology's life-cycle.
- Usability issues? Check. Earlier versions of the device almost required an advanced degree in Mathematics...and pure math nerds could arguably be more useful at Bletchley Park or Los Alamos. (To its credit, such issues were addressed in future iterations.)
- Prima-donna egos? Check. Its eponymous developer, Carl Norden, had originally worked with his future rival Elmer Sperry, but the two geniuses had parted ways before the onset of the First World War.
- Over-hyped promises that didn't hold up under field conditions? Check. Jet-stream winds, cloud/smog cover, higher-than-anticipated altitudes (required to avoid detection and anti-aircraft fire) and a host of other issues put the lie to marketing claims of dropping bombs into "pickle-barrels," and users were forced to develop workarounds (literally) on-the-fly. (Worse, failures have been too often blamed on operator error. Oh-so not cool, yo.)
- Engineering vs. Management? Check. Mercifully, the Navy paired up Army Col. Theodore Barth as yin to Norden's yang. The two became not only a formidable combination but good friends.
- Politics? Check, check, and check. Army vs. Navy. U.S. vs. U.K. Not to mention Round II of Sperry vs. Norden, when the former was called on to take up the slack in the latter's ability to keep up with wartime demand.
- Prolonged obsolescence due to bureaucratic inertia? Check. When last sighted (pun intended), Norden bomb-sights were dropping reconnaissance equipment in Vietnam.
Then, too, as a programmer--particularly one married to a recovering manufacturing engineer--I flatter myself that I have some appreciation of the problems of scaling something new. Sometimes it seems like the real world is nothing but edge cases. At the same time, once it latches onto something better than the former status quo, it typically relinquishes it only after a bitter fight. I sympathise--really, I do.
Yet, if the Norden example is how old men behave when they send young people into possible death and mayhem (with PTSD, addiction, divorce, homelessness, neglect, labyrinthine bureaucracy, and who-knows-what evils to come), the least we can do for current soldiers and future veterans is to give them better old men (and nowadays, old women).
So do me a favour and keep your poppy handy for the next time you head to the polls, okay? For those of us who don't work directly for veterans/soldiers week-in and week-out (i.e., most of us), that's the only kind of "remembrance" that truly matters.
- - - - -
Bibliography:
- skylighters.com: http://www.skylighters.org/encyclopedia/norden.html
- Wikipedia: https://en.wikipedia.org/wiki/Norden_bombsight
- twinbeech.com: http://www.twinbeech.com/norden_bombsight.htm (Lots of photos + a contrarian view here.)
Monday, November 10, 2014
Software innovation, security, and the chain of plus ça change
I've been thinking of sending a client a copy of Geoffrey Moore's Crossing the Chasm to give him an inside perspective on launching a new software offering. Whenever I do that sort of thing, though, I always re-read the book myself, in case it comes up in discussion. It's a fantastic book--don't get me wrong--but it's making me grind my teeth because my copy is the 2nd edition from 1998. That the company/product references are stale isn't so bad--c'mon, I'm a History grad. It's the feeling that I might be missing insights relevant to our broadband, mobile-driven, social media phase of the internet age.
Moore's non-tech references have sent me scurrying out to Wikipedia a few times so far. One of those references was a quote by Willie Sutton, a prolific--though gentlemanly--bank-robber of the mid 20th century. One of Sutton's nick-names was "the actor," because his preferred M.O. was impersonating people who would have a legitimate reason to be in the bank, jewelry store, etc. as a non-customer. Additionally, one of his prison escapes involved dressing as a guard. The true brazenness of that escape was in how, when he and his fellow escapees were caught in the glare of a searchlight as they were putting a ladder against the wall, Sutton shouted, "It's okay!" and the gang was allowed on its merry way.
Sutton caught my interest not because of his apocryphal quote, but because of his later career as a security consultant, writer, and general casher-in on his notoriety. Toward the end of his life, he was even featured in advertisements for photo-IDed MasterCards, which (tongue-in-cheek) advised bank customers to "tell them Willie sent you."
It was impossible to miss the parallels with the only slightly less flamboyant Kevin Mitnick, over whom the media and law enforcement of the Clinton Administration worked themselves into a hysterical lather*.
Mitnick repeatedly stresses that his "successes" were more due to social engineering than software engineering. To quote an interview with CNN:
"A company can spend hundreds of thousands of dollars on firewalls, intrusion detection systems and encryption and other security technologies, but if an attacker can call one trusted person within the company, and that person complies, and if the attacker gets in, then all that money spent on technology is essentially wasted. It's essentially meaningless."
In other words, the art of impersonation strikes again. Also like Sutton, Mitnick's career after "going straight" (in the parlance of gangster movies) involves hiring out his expertise to people who want to defend themselves against people just like him. And, of course, writing books.
Which--in the cluttered curiosity shop I fondly call a memory--calls to mind parallels even further afield in time and space. My Gentle Reader will not be shocked to learn that the "father of modern criminology" and history's first private detective was a reformed criminal. (Also unsurprising: Vidocq's appeal for storytellers and novelists, which presumably didn't dent the sales of his own ghost-written autobiography.)
Then, too, in this part of Maritimes Canada, I only have to drive a few hours to view the remains of 17th- and 18th-century star forts in various states of preservation/restoration. The star fort has its origins in the 15th century (as a response to the innovation of cannon). But the example of Fort Anne in Annapolis Royal, Nova Scotia brings to memory the name of the Marquis de Vauban. Vauban's career under Louis XIV was doubtless capped by his gig as Marshal of France. But that career was made as an expert in both breaking and defending such fortifications. (In other words, he was a one-man arms race. I'm sort of shocked that he didn't write an autobiography, too.)
Doubtless, My Lord de Vauban would strongly object to being compared with the above rogues, however they might have redeemed themselves to society. Yet the parallel is undeniably apt, even for an age defended by earthen walls rather than firewalls. The best defender is an accomplished (though hopefully reformed) offender, it seems.
Long--and meandering--story short, I'm probably fretting needlessly about missing any new insights on ideas that have been relevant since 1991 (when Crossing the Chasm was first published). As we've seen, very rarely is there anything truly new under the proverbial sun. But, hey, as long as I'm already making a trip to the bookstore anyway...
- - - - -
* "While in Federal custody, authorities even placed Mitnick in solitary confinement; reportedly, he was deemed so dangerous that if allowed access to a telephone he could start a nuclear war by just whistling into it." - Forbes. 2013.04.11
Friday, November 7, 2014
Frivolous Friday, 2014.11.07: What is your computer age?
It's probably a textbook case of priming, but after a Facebook exchange with pal Larry earlier this week, the "What is your real age?" (cough!) "quiz" (cough!) seemed to pop out of my time-line like baby Surinam sea toads hatching from their Mom's back.
Larry was taking exception to the fact that the cringe-worthy use of "literally" by people who really mean "figuratively" is receiving official recognition. Doubtless, the Romans seeing Alaric's Visigoths on the horizon felt much the same.
The English major who inhabits those corners of my soul still perfumed by old books and fresh ink is not unduly concerned. After all, this is the natural order of things. The person who lives where two languages blend smiles and agrees. The History major squawks, "Just be thankful you're statistically likely to live long enough to witness it!"
My inner I/T Geek just rolls her eyes and thinks, "Oh, honey, please."
I'm already feeling d'un certain âge as it is. Granted, I've thus far been spared the horror of hearing my all-time favourite tune from high school re-booted by a dreckish pop-star/boy-band who won't be around long enough to be filked by Weird Al. But it's bad enough hearing covers of crap that should have stayed buried alongside the first Reagan Administration. (Ditto movies. I mean, seriously-and-for-realz-y'all, was Footloose not actually bad enough the first time around???)
But compared to measuring age by computer advances, that pales to #FFFFFF. Go back to the line in Apollo 13, where Tom Hanks' character talks of "computers that only take up one room." I was born when they were still multi-room. Gordon Moore had already made what must have seemed like pie-in-the-sky predictions about the computing capacity of the future, at least to the specialists in the know.
But advances in miniaturisation meant that permanent storage (a.k.a. hard drives) had actually been removable for several years. What's more, you could actually own them instead of renting them from IBM, who needed a cargo plane to ship 5 megabytes to you a decade or so earlier.
My step-sisters had "Pong" in the late 70s, but it wasn't until the (very) early 1980s when the middle school computer lab teacher humoured me by actually letting me store my (admittedly) pathetic attempt at re-creating "Space Invaders" onto cassette tape. Our TRS-80s and TRS-80 IIIs didn't actually have removable storage. For normal programming assignments, you printed out your BASIC program and its output in 9-pin dot-matrix on 15cm wide silvery paper (that picked up fingerprints like nobody's business), stapled it to your flow-chart, and turned off the computer (losing your program for good).
By high school, we had the mercy of Apple IIes and (360 KB) 5.25" floppy drives--i.e. no retyping programs from scratch if you screwed up. And 5.25" floppies it remained through college--CDs were what music came on...if you weren't still copying LPs onto cassette for your Walkman. I carried two floppies in my backpack. One was the DOS boot disk, and the other the disk that held both my word processor (PC-Write) and my papers. Later, I schlepped three whole 5.25" floppies. That was after PC-Write freaked out and somehow sucked my Senior project into its Help section. (True story. The tech. in the University lab could only say, "Wow, I've never seen that happen before," and my BFF the MIS major quizzically enquired, "Didn't you have a backup?" and I looked at her like, "What's a backup?" And my boyfriend spent part of Spring Break helping me type it all back in. I married that guy--small wonder, hey?)
Nowadays, I still carry a backpack. It's green, not stressed denim. It's slightly smaller, because I'm not toting tombstone-weight textbooks. Yet it carries the equivalent of over 186,000 5.25" floppy disks. (In case you're wondering, that thumb drive is actually a step-sibling to the one that lives in my purse. So, yes, I have actually learned my lessons about having a backup. Go, me. [eyeroll]) And that's not counting what's archived to cloud drives--at speeds that would have read as pure science fiction back when the high school modems still had cradles for telephone receivers. (Or, for that matter, even in the days when we were using AOL CDs for coffee-coasters.)
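(Math check, assuming the stick in question is a 64 GB model and counting in binary kilobytes: 64 x 1,024 x 1,024 KB, divided by 360 KB per floppy, works out to roughly 186,400 disks. So "over 186,000" it is.)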
So, despite being born into a time when the first quarter or so of my life could pass by oblivious to personal computing, that kind of obliviousness is now pretty much impossible for folks in the developed world...to say nothing of the constant churn computing introduces into daily life. And, when you spend your workdays mostly under the hood, setting your clock to the pace of hardware, software, and networking evolution is a sure way to feel ancient in a hot hurry. (And for cryin' in yer Sleeman's, don't even think about measuring time by the lifespans of technology companies.)
Fortunately, for anyone who considers I/T a craft or a calling, rather than a ticket to a middle-class lifestyle, it's a wellspring of endless possibility. Perhaps even a fountain of youth for those who opt to drink deeply enough.
Larry was taking exception to the fact that the cringe-worthy use of "literally" by people who really mean "figuratively" is receiving official recognition. Doubtless, the Romans seeing Alaric's Visigoths on the horizon felt much the same.
The English major who inhabits those corners of my soul still perfumed by old books and fresh ink is not unduly concerned. After all, this is the natural order of things. The person who lives where two languages blend smiles and agrees. The History major squawks, "Just be thankful you're statistically likely to live long enough to witness it!"
My inner I/T Geek just rolls her eyes and thinks, "Oh, honey, please."
I'm already feeling d'une certaine age as it is. Granted, I've thus far been spared the horror of hearing my all-time favourite tune from high-school re-booted by a dreckish pop-star/boy-band who won't be around long enough to be filked by Weird Al. But it's bad enough hearing covers of crap that should have stayed buried alongside the first Reagan Administration. (Ditto movies. I mean, seriously-and-for-realz-y'all, was Footloose not actually bad enough the first time around???)
But compared to measuring age by computer advances, that pales to #FFFFFF. Go back to the line in Apollo 13, where Tom Hanks' character talks of "computers that only take up one room." I was born when they were still multi-room. Gordon Moore had already made what must have seemed like pie-in-the-sky predictions about the computing capacity of the future, at least to the specialists in the know.
But advances miniaturisation meant that permanent storage (a.k.a. hard drives) had actually been removable for several years. What's more, you could actually own them instead of renting them from IBM, who needed a cargo plane to ship 5 megabytes to you a decade or so earlier.
My step-sisters had "Pong" in the late 70s, but it wasn't until the (very) early 1980s when the middle school computer lab teacher humoured me by actually letting me store my (admittedly) pathetic attempt at re-creating "Space Invaders" onto cassette tape. Our TRS-80s and TRS-80 IIIs didn't actually have removable storage. For normal programming assignments, you printed out your BASIC program and its output in 9-pin dot-matrix on 15cm wide silvery paper (that picked up fingerprints like nobody's business), stapled it to your flow-chart, and turned off the computer (losing your program for good).
By high school, we had the mercy of Apple IIes and (360 KB) 5.25" floppy drives--i.e. no retyping programs from scratch if you screwed up. And 5.25" floppies it remained through college--CDs were what music came on...if you weren't still copying LPs onto cassette for your Walkman. I carried two of them in my backpack. One was the DOS boot disk, and the other the disk that held both my word processor (PC-Write) and my papers. Later, I schlepped three whole 5.25" floppies. That was after PC-Write freaked out and somehow sucked my Senior project into its Help section. (True story. The tech. in the University lab could only say, "Wow, I've never seen that happen before," and my BFF the MIS major quizzically enquired, "Didn't you have a backup?" and I looked at her like, "What's a backup?" And my boyfriend spent part of Spring Break helping me type it all back in. I married that guy--small wonder, hey?)
Nowadays, I still carry a backpack. It's green, not stressed denim. It's slightly smaller, because I'm not toting tombstone-weight textbooks. Yet the thumb drive inside it carries the equivalent of over 186,000 5.25" floppy disks. (In case you're wondering, that thumb drive is actually a step-sibling to the one that lives in my purse. So, yes, I have actually learned my lessons about having a backup. Go, me. [eyeroll]) And that's not counting what's archived to cloud drives at speeds that would have been science fiction to the high school modems with their cradles for telephone receivers. (Or, for that matter, even in the days when we were using AOL CDs for coffee-coasters.)
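(If you want to check my math, here's the back-of-the-envelope version. The 64 GB stick--counted the binary way--is my assumption; the 360 KB disk format is the DOS-era one from above. Adjust to taste:)

# Floppies-to-thumb-drive conversion, back-of-the-envelope style.
# Assumptions are mine: a 64 GB (well, GiB) thumb drive and the
# 360 KB double-sided/double-density 5.25" format mentioned above.
FLOPPY_KB = 360
THUMB_DRIVE_KB = 64 * 1024 * 1024  # 64 GiB expressed in KB

print(f"{THUMB_DRIVE_KB // FLOPPY_KB:,} floppies")  # -> 186,413 floppies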
So, while being born when I was let me spend the first quarter or so of my life oblivious to personal computing, that kind of obliviousness is now pretty much impossible for folks in the developed world...to say nothing of the constant churn computing introduces into daily life. And, when you spend your workdays mostly under the hood, setting your clock to the pace of hardware, software, and networking evolution is a sure way to feel ancient in a hot hurry. (And, for cryin' in yer Sleeman's, don't even think about measuring time by the lifespans of technology companies.)
Fortunately, for anyone who considers I/T a craft or a calling, rather than a ticket to a middle-class lifestyle, it's a wellspring of endless possibility. Perhaps even a fountain of youth for those who opt to drink deeply enough.
Friday, October 17, 2014
Frivolous Friday, 2014.10.17: Feline Intelligence
The doors and windows were tightly shut, and the cracks of the window frames stuffed with cloth, to keep out the cold. But Black Susan, the cat, came and went as she pleased, day and night, through the swinging door of the cat-hole in the bottom of the front door. She always went very quickly so the door would not catch her tail when it fell shut behind her.
One night when Pa was greasing the traps he watched Black Susan come in, and he said: "There was once a man who had two cats, a big cat and a little cat."
Laura and Mary ran to lean on his knees and hear the rest.
"He had two cats," Pa repeated, "a big cat and a little cat. So he made a big cat-hole in his door for the big cat. And then he made a little cat-hole for the little cat."
There Pa stopped.
"But why couldn't the little cat--" Mary began.
"Because the big cat wouldn't let it," Laura interrupted.
"Laura, that is very rude. You must never interrupt," said Pa.
"But I see," he said, "that either one of you has more sense than the man who cut the two cat-holes in his door."
- Laura Ingalls Wilder, Little House in the Big Woods
As Boing Boing reminded me today, the reputed inventor of the cat-door was none other than Sir Isaac Newton. But the surprising part was that he, as legend has it anyway, was guilty of the dual-egress: One hole for the cat and the other for her kittens.
I suppose there are a number of ways that it could be true. Maybe Laura was on the right track. After all, I could totally see one of our cats blocking the door to effectively shut out the other. (I'm looking at you, Rollie!) Or if the step on the original full-size door was too high for a kitten, one cut lower to the ground would make a workable--if ugly--hack.
Then, too, we all probably know someone who demonstrates the stereotypical inverse relationship between brilliance and common sense. And let's face it, the man who invented a new branch of mathematics literally overnight (true story) also made the mistake of reinvesting his profits from the South Sea Bubble's boom back into its bust, and lost his shorts.
Alas, the, ahem, cromulence of the tale falls short: Both Newton's claim to the invention and his subsequent misuse thereof have been roundly debunked.
Innovation is, of course, driven by many factors other than genius. We owe the discovery of penicillin, for instance, to straight-up carelessness. Or we could subscribe to George Bernard Shaw's "The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man." Or, when we find a door cut into a door for the convenience of felines, it's easy to suspect that the mother of invention is occasionally whiskered.
Now, if Sir Tim Berners-Lee had had a cat around when he was inventing HTTP and thus the World Wide Web (which currently exists as a platform for cat photos/videos), I think we would be justified in our suspicions.
Sunday, December 5, 2010
Will close enough be good enough?
The theater where Dennis & I spent part of Date Night is advertising live broadcasts of events such as operas and ballets. I thought that was an interesting niche idea in the "where else ya gonna get that?" sense. The exclusivity of such a showing runs side-by-side with a limitation of choice (i.e. see it then or not at all) that tends to make up one's mind one way or the other. (Anecdotal case in point: Of the movies I've seen in a theater, the one I've most often attended is only offered--in this area, anyway--at UWEC, and then only once a year at midnight.)
But for all the exclusivity, tuning in "live" to something happening at Bayreuth, The Hollywood Bowl, La Fenice, Royal Albert Hall, Radio City Music Hall, etc. from the comfort of your local cineplex is still not the same as being there. I mean, granted, it's not like anyone would be silly enough to do Tony and Tina's Wedding that way, and theater-in-the-round would be a suboptimal experience at best.
But the question boils down to whether "close enough" is "good enough." When Dennis & I finally had the means to venture over to the UK, seeing a Shakespeare play at The Globe was--surprise!--non-negotiable. The experience was a pretty amazing facsimile, all in all: Paying good money to be crammed onto wooden benches, using the knees of the patron behind you as a backrest, occasionally feeling a September drizzle blow in from the open part of the roof. Never mind that the underpinnings of the 17th century original lay under an apartment complex a few blocks off: It was the best to be had.
Which will probably also be the sentiment of the arts-lover who can't swing a trip to the actual production's venue. But whether the approximation will be good enough is another question. I think that, as the gap continues to close between our immediate world and that which can be brought to us via technology, that question will become more and more complex...and interesting.
Thursday, December 2, 2010
Never unnecessarily flatten the adoption curve
Yes, I'm afraid I'm dinging Microsoft again for the ASP "Classic" -> ASP.NET thing. To their credit, when ASP.NET version 1 was first released, Microsoft tried to nudge developers past the pain of upgrading by providing a free migration tool. Which was smart. But that tool won't do me much good, seeing how the language is up to version 4.0. So I spent a bit of time today looking for the 2010 edition of that tool. At first I thought that perhaps my Google-fu was not strong today. Now I'm pretty sure that's not the case. (Although--and I didn't think of this until just now--MS is pretty notorious for changing names of things. I'll take another crack at it tomorrow.)
Yet if I'm proven wrong, I still think that the point largely stands. See, it's not the 1.0 crowd you need to worry about migrating so much as the 2.0-and-later crowd. Reason being, the 1.0 crowd is wired more adventurously. Everyone else is waiting to see whether the pioneers survive their first winter (or two) and--more importantly--whether the risk paid off. During the wait, legacy code will continue to pile up, and the new language will undergo even more changes that can make converting even more painful.
Unlike learning curves (which ideally should be shallow for most of their length), adoption curves should be steep: If elapsed time is the X-axis and the number of adopters is the Y-axis, it's more profitable to front-load that curve with high numbers and worry about the stragglers later. In disruptive, innovative industries, this is the pipe dream of entrepreneurs and venture capitalists. This is the sort of thing that creates industry standards--without knock-down-drag-out trench wars like that between VHS and Betamax.
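(If you'd rather see the front-loading in code than in prose, here's a toy sketch--entirely made-up numbers, plain Python--of a steep curve versus a shallow one over the same ten-quarter window and the same market:)

# Toy illustration only: two cumulative adoption curves for the same
# 100,000-seat market. The "steep" one is front-loaded (midpoint at
# quarter 3); the "shallow" one leaves the bulk of the adopters--and
# the revenue--waiting at the far end.
import math

def adopters(quarter, midpoint, market=100_000, rate=1.0):
    """Cumulative adopters by a given quarter, modelled as a logistic curve."""
    return market / (1 + math.exp(-rate * (quarter - midpoint)))

for q in range(1, 11):
    print(f"Q{q:2}: steep={adopters(q, 3):9,.0f}   shallow={adopters(q, 8):9,.0f}")

Same eventual market either way; very different answer to "when does the bet pay off?"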
And, in my opinion--not necessarily 100% objective, mind you--Microsoft blew it. Whoever made the decision not to migrate the migration tool to the next few iterations of ASP.NET deserves a bonking on the noggin with a copy of Crossing the Chasm. A decision which may have more than a little to do with the fact that Microsoft's web server now supports the open source PHP programming language. Only think how unthinkable that would have been to the incarnation of Microsoft that released .NET 1.0! For all I know, I could be making a classic post hoc, ergo propter hoc error here, but boy-oh-boy is it a tempting inference to make.