Thoughts on computers, companies, and the equally puzzling humans who interact with them
Tuesday, September 22, 2015
Shaving the zebra
Because I have to work in two different worlds (namely, Linux and occasionally Windows, although not yet Apple's walled garden), I've saved myself a lot of cranium-bruising by investing in a KVM switch. This device allows you to toggle between multiple computers using the same monitor, keyboard, and mouse.
I had originally pulled apart my office for painting, then had to re-connect my main programming PC (Linux) to do some troubleshooting. The monitor remained blank. Due to the urgency, I bypassed the KVM switch for the duration, figuring I'd debug later. "Later" came tonight, when the monitor was still unresponsive. After verifying the spaghetti of cabling (and switching to another port, and testing another monitor), I hollered for Dennis to sanity-check for me. He did some poking and prodding, but couldn't find anything amiss.
So I disassembled the KVM wiring and again hooked the peripherals directly into the PC. Again, everything but the monitor seemed to come online. So back under the desk I dove, and discovered what both of us had missed--namely, that this particular model has not one, but two DVI (i.e. monitor) ports. We'd been plugging things into the top one (supposedly interfaced with the motherboard, but not really) rather than the bottom one attached to a big, honkin' (like, early 1990s size) video card.
There's a folksy adage that goes, "When you find a dead man with hoof marks across him, look for a horse before you go looking for a zebra." I'd like to report that I learned this at my Grandma's knee. But the fact is that I picked it up from the only episode of Doogie Howser I can ever remember watching. Anyway, the logic is merely a variation on Occam's Razor. Problem is, we tend to scan for things top to bottom, and, well, why would you continue to look for something you've already found?
The experience illustrates why, despite it being more or less foundational to scientific reasoning, we can still cut ourselves on Occam's Razor through laziness or over-confidence. Because if the afore-mentioned dead guy is found on the plains of Africa, the familiar assumptions become useless--even counter-productive.
I suppose that's the point of the celebrated "Five Whys" of the Toyota Production System: It forces debugging beyond the immediate and superficial.
As it turns out, the KVM switch is still hosed. Naturally, I learned this only after hooking everything up through it again. #mumblegrumblemumblegrumble But at least there was more certainty in the debugging this time around (verifying with a laptop whose video output the monitor likewise ignored). I'll ping Matt at BJW Electronics tomorrow to see whether repairs are even an option in this case. I fervently hope so, and not just because I dislike adding to the landfill: KVM switches are bloody expensive.
Monday, March 30, 2015
"Back to the Future": The coding version
See, over the past 10 years, we've tried training Mister Kitty to stay off the counter. With the result that he's merely become progressively stealthier about it. Which basically makes it an arms race. Opening a new front in the war, the "Mark I" compensates for our less-than-preternatural hoomin hearing.
The thing hanging off a mini-USB cable at the top right is a programmable microcontroller (an Adafruit Trinket). The green rectangle with all the pin-holes is a mini-size solderless breadboard, used to connect all the pieces. The white square inside the green rectangle is a passive infrared (PIR) sensor. The black button under the yellow and blue wires is a Piezo element (for making sounds). And a pushbutton is trying to hide under the second blue wire.
The microcontroller is programmed to listen to the PIR sensor for the signal that it's detected movement of a warm body within its radius of operation. Finding movement, the microcontroller has the Piezo element play a few bars of Tweety Bird's song ("I tawt I taw a puddy-tat..."). The pushbutton arms and disarms the device.
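For the curious, the logic boils down to something like the following Arduino-flavoured sketch. This is a stripped-down illustration rather than the Mark I's actual code--the pin numbers, frequencies, and the crude two-tone "melody" are all placeholders.

    // A stripped-down sketch of the "Mark I" logic (Arduino-flavoured C++).
    // Pin assignments, frequencies, and timings are illustrative only.
    const int PIR_PIN    = 0;  // PIR sensor output
    const int PIEZO_PIN  = 1;  // piezo element
    const int BUTTON_PIN = 2;  // arm/disarm pushbutton (wired to ground)

    bool armed = true;

    // Bit-bang a square wave on the piezo for a given frequency and duration,
    // since the classic Trinket doesn't get the stock tone() function.
    void playNote(unsigned int freqHz, unsigned long durationMs) {
      unsigned long halfPeriodUs = 500000UL / freqHz;
      unsigned long cycles = (durationMs * 1000UL) / (halfPeriodUs * 2);
      for (unsigned long i = 0; i < cycles; i++) {
        digitalWrite(PIEZO_PIN, HIGH);
        delayMicroseconds(halfPeriodUs);
        digitalWrite(PIEZO_PIN, LOW);
        delayMicroseconds(halfPeriodUs);
      }
    }

    void setup() {
      pinMode(PIR_PIN, INPUT);
      pinMode(PIEZO_PIN, OUTPUT);
      pinMode(BUTTON_PIN, INPUT_PULLUP);  // pressed = LOW
    }

    void loop() {
      // The pushbutton toggles armed/disarmed.
      if (digitalRead(BUTTON_PIN) == LOW) {
        delay(50);                                   // crude debounce
        armed = !armed;
        while (digitalRead(BUTTON_PIN) == LOW) { }   // wait for release
      }

      // The PIR output goes HIGH when a warm body moves in range.
      if (armed && digitalRead(PIR_PIN) == HIGH) {
        playNote(880, 200);   // a few scolding notes stand in for Tweety Bird
        playNote(660, 200);
        playNote(880, 400);
      }
    }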
Crude? Sure. Ugly? Yep. But it scored its first success this morning when I busted His Doodness. One yell and he was downstairs, not to be seen for a couple of hours thereafter. The hope is that eventually he'll realise the futility of his schemes and keep his filthy litter-clogs off the counter. And then I can re-use all the parts in other projects.
As a programmer, this sort of project is something of a departure from the norm. For one thing, software takes second place to hardware. As the builder, you're responsible for making sure all the circuitry works and you don't fry components, blow fuses, or burn down the house. That's not something that I normally have to worry about, even when, say, adding more memory to my PC.
But even the software side takes some getting used to. It's not the language (which is a variant of C++, something I learned back in Programmer School) that's the issue. It's the confines of the platform. The Trinket's hardware isn't too far ahead of what went into 80s arcade games. So you're not going to be writing the War and Peace of computer programs, right?
More significantly, there is no operating system--the processor merely runs the same program over and over until power is interrupted or new code is uploaded. Now, if you've never written code for a web browser or a smartphone, the only thing you need to understand is that both these environments are set up to be able to do several things at once.
For instance, when you type a search into Google, each time you type a letter, the web browser goes out to a Google web server and lets it know what's currently in the search-box. The web server responds by sending back its best matches. That's how the results can change even as you type. The major point is that the browser can listen for your key-strokes, fire off its information, and present the results as three loosely-connected processes. In a word: Multi-tasking. Just like we easily-distracted hoomins can do.
The Trinket, by contrast, can only do one thing at a time. In the case of the "Mark I," for instance, it can't listen for the reset button while a note is playing. So I have to sneak in a check on the button's state in between notes. But even then I have to be precise about the timing because the button needs time to "debounce" (don't worry about the definition of that--it's not important) and I have to make sure that doesn't add too much of a pause before the next note.
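In sketch form, that juggling act looks something like this (again purely illustrative, reusing the playNote(), BUTTON_PIN, and armed names from the sketch above): the rest between notes doubles as the polling window, and the debounce delay gets charged against that rest so the tune doesn't drag.

    // Illustrative fragment: check the pushbutton during the gap between
    // notes without throwing off the tune's rhythm. Assumes playNote(),
    // BUTTON_PIN, and the 'armed' flag from the sketch above.
    const unsigned long NOTE_GAP_MS = 60;   // silence between notes
    const unsigned long DEBOUNCE_MS = 20;   // settle time for the contacts

    // Returns true if the button was (still) down after debouncing.
    bool restAndPollButton() {
      bool pressed = false;
      if (digitalRead(BUTTON_PIN) == LOW) {    // possible press
        delay(DEBOUNCE_MS);                    // let the contacts settle
        pressed = (digitalRead(BUTTON_PIN) == LOW);
        delay(NOTE_GAP_MS - DEBOUNCE_MS);      // charge the debounce to the gap
      } else {
        delay(NOTE_GAP_MS);
      }
      return pressed;
    }

    // Used inside the alarm tune, e.g.:
    //   playNote(880, 200);
    //   if (restAndPollButton()) { armed = false; return; }
    //   playNote(660, 200);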
It took more error-and-trial than I should probably admit to, just to make something relatively simple like that work. I suspect that the browser/smartphone mindset I've lived in for over a decade has a lot to do with that. Unlearning is always harder than relearning.
Trust me, I'm absolutely not whining. Back when the Trinket's processing power would have been considered cutting-edge, I would have had to program it in Assembler. Which is the coding equivalent of building the Taj Mahal one Lego at a time. Maybe that's what hipster programmers do; I dunno.
But I will say that having this little thing basically kick my butt is actually pretty fun in a "retro" sort of way. Building like it's 1982...partying like it's 1999.
Monday, September 29, 2014
Shellshock, herd immunity, and the implications for the "Internet of Things"
Continuing the analogy, UNIX-based systems such as Linux and Mac OSX have enjoyed the luxury of a certain "herd immunity." In part because UNIX-based (a.k.a. "*nix") operating systems were designed to be networked, which meant that defences were baked in from the get-go. Also, in the past, most everyone was running Windows. And Windows users--typically but certainly not always--tend not to be the most tech-savvy. Which, for anyone in the market for notoriety, put the virus-writing bulls-eye on Windows.
Which gave rise to a certain smugness among *nix users that ranged from a sighroll ("When will you people learn?") to outright pointing-and-laughing whenever the latest Windows virus was making the rounds.
Last week's "Shellshock" vulnerability may well have brought that smugness to an end...at least blunted it for the foreseeable future. Mainly because the free ride of (perceived) "herd immunity" has come to an end. Apple's OSX has definitely spread from the Graphics Department to other areas of business. Also, the overwhelming majority of web servers are based on some flavour of Linux. Additionally, embedded devices are increasingly based on Linux.
For programmers like myself who do the bulk of their work on Linux laptops or desktops, most of the distributions (e.g. Ubuntu, Red Hat) make it stupid-easy to install security updates. Ubuntu, for one, definitely gets up in your grill about them. (Plus, we generally know better than to ignore them for any longer than absolutely necessary.) By contrast, an Android tablet (which is Linux-based) like my Nexus 7 can be more obsequious, like the velvet-footed, quasi-invisible Victorian butler. (That tiny notification icon in the northwest corner of the screen murmurs, "Would Milady graciously condescend to update her tablet now, or after tea? Very good, marm." But no more than that.) Which is dangerous from the standpoint that updates are easy to ignore, but it's the tablet that's more likely to connect to that coffee-shop wi-fi.
And the proverbial web-enabled toaster? Fugeddabouddit. Its manufacturer was too busy trying to squeeze half a cent off the unit cost (to appease Walmart's accounting goons) to worry about releasing software patches. And that's precisely the problem with the Internet of Things.
And while I don't necessarily like to call for yet more regulation, I think any government that waits until after a catastrophic network attack to require patching is a reckless government. The two major barriers to such a catastrophe are well on their way to becoming null and void. Clearly, the industry cannot be trusted to police itself.
Cost was the first barrier. Unlike Windows or OSX, Linux is free to use. It is--just as importantly--also free to hack up and modify to suit your particular hardware. Either factor is enough to make Linux a no-brainer for electronics manufacturers. Which is fine for standalone gadgets. But when they're exposed to a network (and by extension other devices), an evolving immune system is a non-negotiable.
There's another dimension to the cost factor, and that's in hardware. Previously, only full-size PCs or laptops had the processing power and memory to support a full-blown operating system. Not anymore, when the $40 Raspberry Pi runs off an SD card like the one you probably already have in your camera. Eventually, the only limits on size will be imposed by the need to connect it to monitors and standard ports or peripherals (these days that means USB devices; who knows what hotness tomorrow will bring?).
All that, for a manufacturer, means that they don't have to spend the money and time to develop a custom operating system; they can go with a no-cost off-the-shelf platform.
The second roadblock, scarcity of always-on connectivity, is now disappearing, as public libraries and city buses and in some cases entire cities offer free wi-fi.
The upshot is that we'll have a ubiquitous operating system that is less than likely to be immunised against the latest viruses. And it will be living in the (metaphorical) Grand Central Station of the internet, exposed to any and all comers. Pandemic is not a risk; it's a certainty.
I know that most people couldn't care less if their cyber-toaster tells a black-hat hacker or the NSA or the CSEC that this morning's raisin bagel was (gasp!) not gluten-free. But that's soooo emphatically not the point here. Why? Because hackers do not always want your data. In many cases, they want to siphon your processing power and your bandwidth so they can use it to attack those who do have juicy credit card numbers or email addresses or passwords or naked celebrity selfies or whatever. Which ultimately means that when anyone's too lazy to keep up with patches, they're aiding and abetting the enemy. And complacence, as we know from politics, is complicity.
Naturally, my Gentle Reader is too savvy and hip to be slovenly about such things. They fully appreciate that even their Fitbit has orders of magnitude more computing and communications power than what put people on the moon. And they, beyond question, have the instinctive class and sense of noblesse oblige to know that with great power comes great responsibility.
Of course they do.
Friday, July 1, 2011
Frivolous Friday, 07.01.2011: Founding Hackers
But in the discussion of the illuminati of the "American Experiment," one thing that took me aback--in terms of things that we take for granted--was the claim that if Benjamin Franklin had invented nothing beyond the lightning rod, he would still have been considered a giant in practical science. But, as the kite-flying escapades and some of his more fanciful uses for the new-fangled electricity make for better stories, it's easy to lose sight of the life-and-death aspect.
Sometimes Franklin merely improved on the work of others, such as an early battery called the "Leyden jar," or the design for capturing more heat from a fire that became known as the "Franklin stove." Other inventions, such as bifocal glasses and the odometer, were--to the best knowledge of history--hacks created to meet an immediate need.
And in the best spirit of hacking, Franklin could--in a sense--be considered the father of open source. The "sense" in question being that he refused to patent any of his work. From the Wikipedia article on the Franklin stove:
...the deputy governor of Pennsylvania, George Thomas, made an offer to Franklin to patent his design, but Franklin never patented any of his designs and inventions. He believed “that as we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours, and this we should do freely and generously”. As a result, many others were able to use Franklin’s design and improve it.

Thomas Jefferson, no less a tinkerer (and a math nerd besides), also dabbled in cryptology during his various duties to the fledgling republic.
However, Jefferson believed in limited-term patents to balance the financial incentive for invention (and thus human progress) against perpetual monopolies that would hurt the public interest. He, like Franklin, did not patent his work on the moldboard plow (basically a hack for the hilly soil of his Piedmont stomping grounds in Virginia). And, to the manufacturer of a device for producing duplicate copies of one's writing--then known as a "polygraph," although the word has a different meaning now--Jefferson supplied all manner of suggested improvements (and apparently beta-tested them as well) over the course of writing thousands of letters.
I know that we have a tendency to create the founders of this country in our own image, and my highlighting their geeky pedigree is no exception. Yet the trick to biographical history is to never assume you know the people you're researching. And, above all, to remember that when you go looking for history, sometimes history comes looking for you.
Tuesday, May 24, 2011
iPotato
After reading--by which I mean deep-skimming, and yes, I do consider that a distinction--the article, the Inner History Major snorted awake and mumbled, "Right. Basically we're talking the Irish Potato Famine. Got it. Zzzzzzzzzzz..." The IHM was mainly thinking of the monoculture component of the famine, what with the concerns about consistency and timing in the supply chain. Not to mention the overhead cost associated with putting the screws to multiple suppliers to ensure that one doesn't chisel you out of that fraction of a penny on a three-figure tablet or smartphone: Quelle horreur... (An economy stacked in favor of absentee landlords and de-facto colonialism...that historical parallel pretty much speaks for itself.)
But IHM had a point, and the leftier-brainier part of me couldn't help but wonder: How scalable is scalability itself? True, macroeconomics has reams to say about the virtues of specialization. But even the greasiest of gears can't avoid some grit. Or--more ominously--flaws and stress-points in the metal itself. Or--as this week has demonstrated--random acts of freakish nature.
(Then, too, the contrarian part of me--smirking sarcastically at every other part from its snarky digs--likewise couldn't help itself and wondered why anyone would shell out half a grand on a tablet to be like every other slavering fanboy/fangirl. Doubtless, the next iPhone/Android phone could easily get away with the schtick so common to '90s boutique catalogs: "Due to the natural variations in outsourced manufacturing, please allow us to select one for you." You can't tell me the Kool-Aid swillers wouldn't pound that down...)
The geek in me just knows, though, that if you rely on disasters to test your failover plan, you don't really have backup. Which applies to people and their knowledge-sets just as much as it does to hardware and connectivity, by the bye. And baking in a certain amount of slack in lieu of stuffing more eggs into the same basket is, really, what it amounts to. Plus, I figure that if my own trade--programming computers--can be subjected to the ethos of assembly-line manufacturing...hey, we might as well make that botched metaphor a two-way street, no?
Sunday, October 17, 2010
From Kin to #WIN?
But first some backdrop, and foremost the observation that there are still one heck of a lot of people still on XP. Partly because of Vista's reputation for bloatedness. Partly because of the nuisance (and risk) of upgrading. And partly because, at a certain point, an operating system is merely a means to an end, that end being running the programs that help pay the bills. Or, failing that, run programs that at least try to amuse you. Windows 7 looks slicker, certainly, but we're already prepping for a good deal of headache for the cutover, mainly from the standpoint of getting our old familiar programs to work on it. Speaking purely for myself, I'm figuring that the workday normally lost to workstation upgrades can safely be doubled for Win7.
So, that all being said, wouldn't it be interesting if the new phone platform served a double purpose: 1.) rebounding from the Kin, and 2.) introducing Windows 7 in a sleeker, sexier context than the desktop? If they pull it off, more power to them. Partly b/c XP could use a dignified retirement even more than Brett Favre. But mostly because just about any market--like politics--benefits from a strong third party.
Saturday, August 28, 2010
Historical perspective
Fortunately, there's a great deal of perspective waiting behind the instinct to revel in the great, good fortune of living & working in the PC/smartphone era. A pair of such moments were waiting for me a week ago at the Army Museum located in the Citadel of Halifax, Nova Scotia.
The first was a set of headphones. Not exactly the padded featherweights of Bose, these. But that probably wasn't high on the priority list of the folks using them to locate mines in World War II. My maternal grandfather, a mine-sweeper of 54-F Pioneer Company in the First World War, would have rejoiced to have such technology at hand. Preferably before his best friend (also a mine-sweeper) missed a trip-wire and was blown apart while Grandpa could only watch (and/or dive for cover).
The second artifact is what WWII called a mobile phone. Transmission by radio waves (which, in military terms, replaced a terribly vulnerable--because wired--telegraph technology) predated even Grandpa: the British had already put it to use during the Boer Wars, and developed a highly complex system for sending messages to their fleets after war broke out again in 1914. Doughboy Signal Corps units relied on "portable wireless outfits" (i.e. wireless telegraphs that could be comfortably carried by four men or truck-mounted for field use) with antennae ranging between three and four feet tall.
In addition to adding full-fledged telephony, WWII's mobile communications (as shown above) saw great advances in compactness. But, alas, our perennial complaint of losing all bars has a long and dishonorable history: Significantly, it was a contributing factor in the ill-conceived debacle known as Operation Market Garden. Needless to write, the cost was paid in blood and treasure--and dearly.
Which, IMLTHO, may be something to keep in mind when the next iGadget is unveiled to the sound of the trade press hyperventilating over how it "revolutionizes," well, everything. In reality, that particular revolution's been here and gone. Sure, another may come along, but spending all attention on whether or not something has a built-in camera pretty much guarantees that the next (real) revolution will come out of left field.
Wednesday, July 14, 2010
Rumors of the desktop's demise...
As much as the laptop's keyboard feels natural, my problem-child wrist is a bit grouchy, my lap is quite warm, and most importantly, I frankly haven't done much "work" since losing the desktop that hasn't been word processing or web browsing. Mainly because I'm too spoiled with screen real estate--something that's at a premium, even on a battery-pig like this.
If the work of the future is all about creativity and collaboration, that pretty well guarantees multi-tasking. We have a number of different jobs at my office, but none of us has anything less than either two monitors or one honkin' big one. Why? Because collaboration means email and IM at a minimum. Folks on leaner budgets may also live & die by Skype for communication as well. Then there's the need for research and corroboration: Enter the web browser. Finally, there's whatever tool-set is appropriate to the job. (In my case, you can pretty much count on an integrated development environment, at least one database window, and probably at least one other tool for file comparison, search, or transfer.) I'd happily take a third monitor if I could.
The point is that only something with that physical real estate (plus text input and ready access to all the user interface elements available) allows for information to flow between people and between applications and eyeballs without the cost of interruption. Single-tasking--that darling of "curated computing"--simply won't cut it for anything but specialized applications and recreation. But the cross-disciplinary work that creates those applications and amusements? Trying to build those on a gadget could be the short road to flying lessons for the gadget in question--or at least to bankruptcy for the company that tried to make a profit that way.
Saturday, July 3, 2010
Geeky growing pains
It could have been worse, though: At least my iPhone-packing nephew (and/or his Wii-addicted younger brother) wasn't around to see that near-mortal bout of uncoolness.
Changes have teeth--and have been known to nip. All the same, not changing until it's too late almost always bites. And with that, I'm off to upgrade software and plant my eyeballs back into Reto Meier's excellent book on Android application development.
Sunday, June 6, 2010
Another new "race" in technology
Another riff brought to you by last night's trip into Barnes & Noble. Not so much about books, though, because--for once--the in-your-face front display did not contain books. No, the pride of place normally given to J.K. Rowling and Stephenie Meyer was instead devoted to the Nook eReader. No great shocker there, just elementary retailing in action. In fact, I think it was just last week I'd read that both Amazon and B&N are more aggressively marketing their gadgets.
But the poster-blurb about wi-fi & 3G did catch my attention, along with some mention--I thought--about web browsing. It turns out that the web browsing thing wasn't just the mudslide pie sugar-rush talking after all. Amazon's next-gen Kindle will (allegedly) also have built-in wi-fi--presumably in conjunction with, or in place of WhisperNet.
Together, they only make me (again) wonder how long it will be before accessing an unfettered internet will be a "given" in the same sense as electricity and hot & cold running water. Which is interesting timing, considering how AT&T and Verizon and Comcast and a number of their ilk are dusting off the notion of data-capping and/or actively fighting net neutrality.
In a sense, it's a race to see whether always-on and unmetered access will become either a quasi-right or a privilege in the American mindset. Granted, utilities are typically metered, but phone/data companies try to have it both ways by combining use-it-or-lose-it with overage charges. I mean, seriously, who talks exactly 500/1000/whatever minutes a month? Yeah, didn't think so--which makes it a lose-lose proposition all the way.
Personally, I'm betting on quasi-right. And the driving reason is the explosion in the number of gadgets that come with web browser as a standard feature. Gadgets that are live in a handful of seconds, rather than the time it takes a PC or laptop to go through the operating system equivalent of waking, showering, brushing teeth and reading the paper over coffee. In a phrase, instant gratification. Standing between the American consumer and instant gratification generally isn't smart business. And, having recently rolled my eyes half-dizzy shopping the talk/text/data Happy Meals the brand-name mobile giants offer, I can't say as I'll feel sorry for them when they lose the race.