Thoughts on computers, companies, and the equally puzzling humans who interact with them
Tuesday, June 28, 2011
Kudos to Rooster Andy's for the verbiage I saw on their sign during yesterday morning's commute, which publicly congratulated one of their employees for moving on to another job. That's pretty outstanding, and I thought it deserved a shout-out.
I only wish I liked BBQ chicken better...
Friday, June 24, 2011
Frivolous Friday, 06.24.2011: Geek bar jokes
A J2EE JDBC connection walks into a bar. The bartender asks, "What'll it be?" "Nothing right now, thanks," replies the connection as it makes a bee-line for the billiard tables, "I'm just here for the pool."
Richard Stallman walks into a bar. Recognizing him, the barkeep asks, "Hey, Richard! What's GNU?"
A memory slot walks into a bar. Peering down at its diminutive size, the bartender snorts, "You must be a cheap drunk." "Yeah," concedes the memory slot, "I can really only hold one DRAM."
An ICMP ping walks into a bar, twiddles its thumbs at a table for a second, then demands, "How long does it take to get a server around here?"
An HTML <THEAD> tag, <TBODY> tag, and <TFOOT> tag walk into a bar and order a round. When it's about time for seconds, the bartender notices that he can only see the <THEAD> and <TBODY> at their booth. "Where'd your friend go?" asks the bartender. "You mean <TFOOT> ?" reply the <THEAD> and <TBODY> tags, "He's always under the table!"
A web browser walks into a bar where a web server is tickling the ivories at the house piano. The browser sticks a few bills into the web server's jar and asks, "You take requests?"
A *nix print job walks into a bar. After a few rounds, the bartender notices how pie-eyed the job has become and sighs, "In its CUPS again..."
The Java Math class walks into a bar and announces, "Hey, everybody--the next .round() is on me!"
Steve Wozniak walks into the Genius Bar...
A SATA drive and an IDE drive walk into a bar. After nervously glancing around, the IDE disk drive whispers to the SATA, "Are you sure we want to be here? This place looks awfully SCSI to me."
Wednesday, June 22, 2011
Short-selling the dinosaurs, or "Here we go again..."
With the rise of the smartphone, the attendant hype has included some talk about the "ghettoization" of the internet--in the sense that "the internet" is defined as content snarfed from one or more web servers by a laptop or even horrifically retro desktop computer. Yet, as I read yet another "the reports of my death have been greatly exaggerated" article, titled The Fall of Wintel and the Rise of Armdroid, it occurred to me that the coming "ghettoization" may not be drawn so much along the lines of content producers vs. consumers as along content itself.
The distinction between playing a game on a small screen and everything that goes on behind it (interface design, scaling data and processing over multiple servers, and writing/testing/deploying all the code that makes that happen) is the distinction between the proverbial tip and the iceberg. (Even minus Kate and Leo and a whole lotta CGI.) I hope we can agree on that.
Disclosure: I don't own a tablet or smartphone, per se. (Yet.) A netbook--with a keyboard that would have put Margaret Mitchell ("Gone with the Wind") on the sidelines well before Atlanta was toasted--yes. And I've certainly been accused of shallow thinking. And not just recently, nor without justification.
Which, I'll admit, makes it seem more than a little pretentious to swim against the tide of "conventional wisdom." ('Cuz when business writers predict long-term computing trends, it's totally like, "Gartner data-point. Your argument is invalid." 'Nuff said, right?) Even against the swaggering conventionality of dudes like Mr. Allsworth--who, so far as I can tell, think they're scooping the meteor from "Fantasia" just as the dinosaurs double-take the bright light in the sky like some chorus line of "Durrrr." Because we all know how sharply striated the mainframe-to-minicomputer-to-PC adoption was, yes?
Mockery aside, I think I can safely predict that we're living in a Golden Age of niches--perchance even a Cambrian explosion of computing life-forms. Simply because hardware is cheap, software alternatives range somewhere between "cheap" and "free," and tying together systems is not limited to dedicated telephone wires--owned, I might add, by a monopoly. Making the statistical likelihood of such one-or-the-other thinking rather on par with being struck by lightning during a shark attack.
No doubt 24/7 availability of fully networked computers responsive in a more three-dimensional sense will change the equation somewhat. But the fact remains that small screens with cramped user interfaces are geared to forms of content for which a desktop setup in which you can immerse yourself for twelve hours straight (thanks to three monitors, keyboard, mouse and who-knows-what-besides) is thermonuclear-scale overkill.
For instance: There's snapping a photo, cropping it, tagging it, uploading it--yea, even with LOL-caption. There's firing off the multi-person SMS message otherwise known as a tweet--or even a skinny post. Stupid-simple, and as close to "free" (in terms of time and money) as possible for both the creator and the recipient. Then there's the longer-term commitment of content on the level of, say, "Avatar" or "Inception." Even bootlegged copies carry the cost of going on two hours of time. (And, in an economy where too many work more hours for less compensation, don't ever make the mistake of discounting the value of "idle" time!)
Seriously now...will the next Lady Gaga video be mixed on an iPad as facilely as a throwaway iApp can make caricatures of your photos? Me, I'm thinking not. And not only from the standpoint of raw computing power--something that typically comes in inverse proportion to the prized battery life of such devices. A multi-screened Mac, fully accessorized, by contrast, will capture the nuances that dumbed-down resolutions and tinny, cheap earbuds will not. All the difference in the world between a handful of Facebook friends and millions of "L'il Monsters," in other words.
In short, content is not created equal. Either in the creation or the consumption, I might add. And never will be. Just like sometimes you can get by with the "fun-size" Snickers bar you poached from the communal candy jar--the calories don't count if you pitch the wrapper in your cube-mate's waste-basket. Honest--I read it in "Scientific American." But at other times nothing short of the infamous seventeen layer "death by chocolate" volcano cheesecake torte from the local Tchotchke's will do.
Or something like that.
Tuesday, June 21, 2011
Blog post Wed. night
Beekeeper's meeting followed by "Date Night" takeout from Gracie's Gyros over a bottle of retsina. And suddenly it's nearly 10:30. Which, on a "school night," means that tomorrow will be Blog Night.
P.S.: Gracie's is lovely this time of year...no fighting college students for tables!
Friday, June 17, 2011
Frivolous Friday, 06.17.2011: Winged muses
For the last few months, I've been either getting a project out the door, fighting fires or tying up the proverbial loose ends on nearly six years of my working life. Which doesn't leave much time for in-cubicle "play-time." Outside of work, the joy in coding--particularly having my head handed to me by another language--just wasn't there. So it was easy to let the mundane requirements of adult life take over and decompress by reading with the balance of my free time.
To me, coder's block is no different from writer's block--in terms of symptoms or treatment. Sometimes you have to force things; other times, it's better to busy yourself with something unrelated while your subconscious mind works it out. And inspiration can come from the unlikeliest places.
Take the pet store, for instance. My simple mission was to pick up a bag of finch kibble. I've never trusted a store to provide optimal housing for any birds--particularly not after the shameful conditions I saw at the PetCo north of Rochester (MN) in 2000.
So I took up my self-appointed role as Finch Inspector at Large at the Onalaska PetCo. Mixed results there--particularly where they crowded too many Spice Finches and Society Finches in the same cage. (I'll be going back tomorrow to make sure that the Spice Finch with the bald spot was, indeed, put into isolation as promised. There will be Gre'thor raised with the manager if it hasn't.)
But I knew they wouldn't put the little white-capped Zebra Finch "in iso." just because it was being bullied all over the cage. And the sleek little African Silverbill was already all alone...
And I'll bet you'll never, EVER guess what happened next.
Sigh. The spirit of Charlie Brown adopting the pathetic little Christmas tree strikes again. Naturally, both birds Houdini'd out of their carrying boxes when I brought them home, which only served me & my sappiness right.
But as I settled them into their quarantine cages, I thought that this weekend would be a good time to get a starting weight for them. As we let the flock of Gouldians live out their natural lives, we--sloppily--fell out of the habit of regular weighings. But with two new young'uns, there's no excuse for that.
Which, I realized--less than a second later--would make a splendid first Android application. And, ultimately, an equally splendid excuse to break down and splurge on a 'droid tablet--something I promised I wouldn't do before I'd written my first non-"Hello, World!"--i.e. non-trivial--Android application.
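Just to pin the idea down before it flutters off: the sort of bare-bones weight log I'm picturing looks roughly like this--plain desktop Java for now, nothing Android-specific, and every name in it invented on the spot.

import java.util.ArrayList;
import java.util.List;

public class FlockLog {

    // One weigh-in for one bird.
    static class WeighIn {
        final String bird;     // e.g. "white-capped Zebra Finch"
        final long timestamp;  // when the weight was taken, in millis
        final double grams;

        WeighIn(String bird, long timestamp, double grams) {
            this.bird = bird;
            this.timestamp = timestamp;
            this.grams = grams;
        }
    }

    private final List<WeighIn> entries = new ArrayList<WeighIn>();

    // Record a weight for a bird, stamped with the current time.
    public void record(String bird, double grams) {
        entries.add(new WeighIn(bird, System.currentTimeMillis(), grams));
    }

    // Most recently recorded weight for a bird, or -1 if never weighed.
    public double latest(String bird) {
        double result = -1;
        for (WeighIn w : entries) {
            if (w.bird.equals(bird)) {
                result = w.grams;  // entries are appended in order, so the last match wins
            }
        }
        return result;
    }

    public static void main(String[] args) {
        FlockLog log = new FlockLog();
        log.record("white-capped Zebra Finch", 11.5);
        log.record("African Silverbill", 10.2);
        System.out.println("Silverbill: " + log.latest("African Silverbill") + " g");
    }
}

Strap a real data store and a couple of touch-friendly screens onto something like that, and the tablet practically justifies itself.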
And--whaddya know?--there's my extracurricular coding mojo! Standing right there! How've ya been, old friend? Let's boogie!
(P.S.: Thank you, little ones. May your kibbles always crunch.)
Tuesday, June 14, 2011
Living in a three-party system
Dad wanted to review the photos on his camera's memory card. Dad's a Windows user, and I don't have an SD slot on my XP desktop. So the only option was to fire up the Ubuntu laptop, log in, and hope for the best.
Now. What you should understand about my Dad is he's Mister Fix Anything, thanks to being in medical maintenance since before I was born in the hospital that employed him. Plumbing. Electrical. Mechanical. HVAC. (Computer-controlled pneumatic tube systems seem to be his special joy. That and being privy to the more colorful antics of the staff and administration--such generally being a fringe benefit of working third shift.)
He's turned his attention to desktop and laptop computers as well, so I didn't think that the Linux thing would throw him off too much. (Heck, I've managed to pass off Mandrake as Windows...and that in 2004.) Wrong. It seems that whatever software's installed on his home computer opens the root of the SD card and ignores the folder structure to present all photos in slide-show format. To Ubuntu, however, the card was just another drive--no different from a hard drive, CD, DVD, or USB device. (Of course, the fact that his camera apparently creates a new folder for each day--which can mean a folder containing a single .JPG file--doesn't make browsing any easier.)
We fumbled through it one way or another. But it was pretty obvious that Dad was convinced that having all photos at his fingertips was The Way It's Supposed To Work, Darnitalready. His software completely concealed the folder structure (and thus the complexity) from him.
It's anecdotes like those that make me suspect--and rejoice--that operating systems will always be a multi-party system. I know my preferences, and--I fondly hope--understand some of the values that drive them. But values in operating systems--as in politics--can be incompatible to the point where compromises, while possible, have a kludgey feel to them. I more or less type for a living, and fingerprints on my monitor, frankly, skeeve me the heck out. Which pretty much rules out any tablet that doesn't dock into a keyboard. My co-worker in the pod next door, by contrast, might not be acquitted of actually naming his iPad and sleeping with it on a pillow next to him. At least not in the court of public opinion. Much the same might be said for his Android phone. And more power to him.
Which brings us back around to the politics metaphor and the moral of the story: Knowing your core values (and the trade-offs they entail) always trumps finding justification for your allegiance to a particular system.
Friday, June 10, 2011
No blog post tonight
Prepping for a family reunion and having company over has called dibs on my weekend and beyond. Hope that my gentle reader's Friday is sufficiently frivolous without my contribution.
Cheers, all!
Tuesday, June 7, 2011
Lopsided metaphor
Apparently, hyperopia is a hipster condition...at least in the sense that I've never heard of it--not that this is much of a touchstone, mind you! (The less fancy name for those of us lacking a degree in ophthalmology is "farsightedness.")
It occurred to me to look up the word while I was testing a fix, and realized that I was doing so with full God-Emperor-of-Dune system privileges. I should note that only a meagre handful of users have that level of access. But testing (either formally or informally) with the full menu of features at hand is not always a good thing--as I discovered to my mortification years ago when a necessary feature actually generated errors for any poor, average hoi polloi schlub unfortunate enough to have to use it. (Small wonder it was the proverbial red-headed step-child of features...y'think?!?!)
I think it's telling that I found the word "hyperopia" not thanks to a mind that's a sponge for over-educated trivia. Nor even via an educated guess based on a two-ships-passing-in-the-night acquaintance with Greek.
No, I merely Googled "opposite of myopia." Because "myopia"--a.k.a. near-sightedness--is a term that has some layperson mileage. And that's what puzzles me. When a person or organization or culture is described as "myopic," that's invariably a bad thing. Parochial. Head-in-the-sand. That sort of thing.
But we don't have an equivalent term for people so busy worrying about what's going on outside the walls of their self/organization/country that they forget to take care of business. And I think you can definitely make the case that "hyperopia" is an equally deadly sin. You saw it in the "but everybody else is doing it, and if we don't they'll eat our lunch" Wall Street lemming-stampede. You see it in investing even now...by so-called professionals. You see it in the way our country cheerleads democracy and freedom and gender equality and anti-corruption efforts in other countries but thumbs its nose at them inside our own borders.
My position? Screw the Joneses. Let them lose sleep worrying about how they're going to keep up with you. Sure, pay attention to what's on the horizon and dedicate a percentage of resources to the hic sunt dracones part of the map--no question. But otherwise, there's no substitute for taking care of business.
Friday, June 3, 2011
Frivolous Friday, 06.03.2011: Funny money
It doesn't seem like it's been that long since the birth announcement, but online pal A.'s daughter Z. is now old enough for visits from the Tooth Fairy. A. tweeted that $0.50 had materialized under Z.'s pillow. I twitted him that clearly imaginary beings are not subject to the laws of inflation, because that was exactly the going rate in "my day"--an epoch that, in broad historical terms, was bookended by Watergate and Disco.
Which brought to mind one of the (few) high points of Harvard Lampoon's send-up of Tolkien, Bored of the Rings. Quoting purely from (1995-vintage) memory, the menu of the "Feast of Orlon" was satirized thusly:
"Like all mythical creatures who live in the forest with no visible means of support, the elves dined frugally on nuts, berries, bark and dirt."
Needless to write, I condemned myself to English Major Hades--because we Liberal Arts types are just too literate for a monosyllabic "Hell," don'cha'know?--by snorting with laughter. Uproariously.
But that was 1995. When we looked back at gold-hoarding and rolled our eyes. When the idea of an exchange rate between, say, D&D GP ("gold pieces" to the uninitiated) and greenbacks would have been laughable. When the real-estate bubble that had middle class Americans snapping up ticky-tacky twin-dos to flip to similarly beady-eyed middle class Americans was even more laughable. (In retrospect, anyway.)
You can say what you like about the trench-war between eCommerce and bricks-and-mortar, but EverQuest and Diablo and their ilk definitely put "virtual goods" on the map. And to my mind, it wasn't so much that the RIAA/MPAA had to compete with "free," it's that they had to compete with broad-band download rates. Throw into the mix the fact that Apple has more or less conditioned us to value both songs and smartphone/tablet applications at 99 cents a pop. Texting to a specific number, last time I checked, anyway, was an automatic ten clams.
Put in perspective, one can only wonder at the unintended (or at least unanticipated-by-the-masses) consequences of cellphone payments.
This isn't the first time I've scratched my head at paying real money for not-so-real stuff. Or even that I've been appalled at the nasty, brutish underbelly of virtual economies. But it is the first time I've had the mortifying suspicion that we programmer & online gamer types might have--completely inadvertently, I swear!--lowered the bar for everyone else.
Thus--I suspect--was born the notion of taking mortgages (issued to pretty much anyone with a pulse), spraying the regulatory equivalent of Lysol on them, and passing them off as actual, honest-to-Pete, value-adding "investments." And, because we 'mericans tend to be slow learners (as in, "Oh, but that was Manchuria; Japan would never have the stones to attack U.S. soil..."), we might just see a badly-scripted sequel in the staid fields of betting on your life expectancy.
Considering the number of platinum-parachuted Wall Street suits who happily short-sold your castle, I would think the thought of short-selling your actual...well...life should worry you, no? Because, in the grand scheme of things, insurance companies are only selling an egregiously non-tangible product (a.k.a. "peace of mind"): Am I wrong?
(Note to self: Do not encourage Dennis to splice Soylent Green into the Netflix line-up anytime soon. kthxbi.)
Part of me feels like I should apologize--on behalf of all Geekdom--for how the virtualization of goods might have actually softened up the economy for the Great Recession. But, on further review...eh, naaaaahhh. We're the ones who take our mortgage Algebra straight-up, without the sugar-froth of ARM or interest-only pixie-dust. And, perhaps more significantly, maybe it takes an immersion in economies that are fake-for-real (e.g. Monopoly, WoW, Eve) to appreciate the fine distinction between gold-farming and derivatives-brokering. Just saying.
Wednesday, June 1, 2011
It takes a tribe to raise a member
I'm not sure I want to perpetuate the analogy of "tribes" to a work environment, because, going on my career so far, I've rarely seen the necessary level of cohesion extend beyond, say, three co-workers.
But Best Friend H. was chunking out code when I was still struggling to parlay a Liberal Arts degree into a "real job," so I tend to trust her instincts. This past weekend, she used the phrase "tribal knowledge" in the context of griping about why outsourcing projects to contracting firms could generate so much friction. Friction, I might add, that had nothing to do with time zones or language or even cultural norms.
That concept has been brought home the past few weeks as I've been on the short road to tinnitus, trying to block out days' worth of brain-dumping. And that's just for one application. We're trying not to traumatize the trainee, so I haven't even started loading my dump-truck just yet. (Much less the gleeful rubbing of hands, twirling of moustache, maniacal laughter, etc., etc.)
But in a much more tangible sense, I only need rewind to last Friday. I don't think I'm telling too many proverbial stories out of school when I say that a software enhancement was expected to be rolled out to the production (i.e. live) website this Tuesday, and we still needed to make a final check on the "beta" (i.e. dress rehearsal) server. Quality Assurance (QA) was already slated to test over the weekend, and Alpha-Geek was likewise planning to make the final code-push, assuming it passed muster.
I handle promotions in the afternoon, so the programmer responsible passed the issue off to me. So far, the tribal knowledge seemed to have trickled down, despite the fact that he'd only been with us for a small handful of months.
Except the tribe missed passing on the bit of collective wisdom that says that if your patch is that important, you should probably stick around to see it safely promoted and, oh, maybe even spot-check it before handing it back off to QA. But, being a self-respecting barely-post-collegiate type, he, naturally, made for the door in anticipation of a three-day weekend.
I duly promoted the database portions, then realized that the programmer had forgotten to merge one chunk of the code into the production branch of the Subversion repository. Hypothetically, this should be only a minimal inconvenience, because I should be able to update my copy of the "production" repository, merge the designated changes from the beta version, and commit them up.
It's an amiable enough hypothesis, to be sure. But it reckons without the fact that, between the time the programmer made his changes and the time I had to merge them, a directory had been deleted. Which meant that Subversion had its little drama-queen hissy-fit freakout about a tree conflict. In laypeople's terms, that left me with three possible options (a rough command-line sketch of the whole dance follows the list):
A.) Trust my version of the files
B.) Trust the incoming version of the files
C.) Try to negotiate a compromise
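For the non-Subversion-speakers in the audience, the promotion merge looks something like the following. (The repository URLs, paths, and revision numbers here are strictly hypothetical--only the general svn workflow is the point.)

# Get a clean working copy of the production branch (URL is invented)
svn checkout https://svn.example.com/repos/app/branches/production app-prod
cd app-prod

# Merge the designated changes over from the beta branch (revision range is illustrative)
svn merge -r 4120:4135 https://svn.example.com/repos/app/branches/beta .

# A directory deleted on one side shows up as a tree conflict, e.g.:
#   C  modules/reports  >  local delete, incoming edit upon merge
svn status

# Option A amounts to keeping the working copy's view of things; Options B and C
# mean pulling the incoming files back in (or hand-merging) before resolving.
svn resolve --accept=working modules/reports

# Commit the promotion once the conflict is sorted out
svn commit -m "Promote beta changes to production branch"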
Only problem was, I wasn't privy to the "tribal knowledge" behind the "missing" folder and its contents. That had left the building with the programmer in question--and, for that matter, with everybody else. Which left me with two possible options:
A.) Take a wild, flying stab in the dark and let other people shop-vac up any resulting damage
B.) Document what I'd done so far and the problems encountered and foist the rest of the promotion off on Alpha-Geek
Sadly, the latter was the kinder option. (From a passive-aggressive standpoint, it was probably a horse apiece, so I refuse to feel guilty about choosing (B.).) Apart from being copied on the note that Alpha-Geek had ultimately finished the promotion (after working around said freakout), I didn't hear bupkis about the incident after that. I should probably check in to see that the appropriate party was shown the error of his ways.
But the "teachable moment" from all this should, I trust, be the ample demonstration of the value of cultivating knowledge as a tribe. Or, perhaps more aptly, the counter-wisdom of grafting individuals (and even whole teams) onto a project without thoroughly marinating her/him/it in the deep end of the knowledge-pool. The best possible outcome otherwise is gross inefficiency; the worst is wholesale disaster. The foolhardy grafter should expect no sympathy in either case.