Friday, September 30, 2011
I'm finding comedy works best to keep me from dwelling on how much longer I'm going to be on an elliptical machine, treadmill or what-have-you. So earlier this week it was an old standby, Office Space. (If you haven't seen it, suffice it to say that it's sort of a cult classic for programmers.) Coincidentally, this was also the same week that someone decided to riff on one of the movie's plot-points and steal a fellow programmer's red Swingline stapler. Twice.
I polished off Office Space and turned back to Monty Python and the Holy Grail, riffs from which are unavoidable in the SCA. That'd be like trying to play golf without at least one wink-wink-nudge-nudge reference to Caddyshack.
Bad enough that Dennis & I have already trained each other so that neither of us can phrase an "or"-type question (as in, "Do you want four cheese or meat-lover's supreme?") without expecting the answer to be an obligatory "Yes." Or "True" or "1" if someone's feeling exceptionally nerdy.
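(For the non-programmers in the audience, here's a minimal sketch of the joke in Python--the pizza toppings are hypothetical, but the semantics are real:)

# The nerd's answer to any "A or B?" question:
# "or" returns the first truthy operand it finds,
# so the expression as a whole is, well...true.
answer = "four cheese" or "meat-lover's supreme"
print(answer)        # four cheese -- the first truthy operand wins
print(bool(answer))  # True. Which is to say: "Yes."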
But then I made the mistake of remembering the phrase, "Darmok and Jalad at Tanagra." (If you're not a Star Trek: The Next Generation maven, here's the schtick: The Federation has bumped into (yet another) alien race that (surprise!) just happens to be recognizably humanoid. Moreover, the Universal Translator can even babblefish--yes, I just used that as a verb--their language into English words. Problem is, it still doesn't make sense, because the Tamarians communicate exclusively in allegory--meaning through references to stories from their history. Think of it as tribal knowledge on steroids.)
At first I thought, well, we're not that bad. But then I realized--particularly after being chagrined at how much of the "Brave, Brave Sir Robin" song I've forgotten--that any nerdery is a continuum. Meep! Ummm...how many restroom stops until Tanagra?
Tuesday, September 27, 2011
Ultimatum
Maybe it's that I've been reading too much non-history non-fiction lately. Or maybe the topics are just too...shall we say...inbred. But I've bumped into enough mentions of a game called "Ultimatum" that it's stuck with me. The word "game" is a misnomer, at least in the sense that Ultimatum is nothing you'll find keeping Monopoly company on the closet shelf. It's actually played by volunteers in psychological studies at universities and other institutions that study human interaction under controlled conditions.
The basic premise starts with two people. Person A is given a fixed dollar amount (usually ten bones in the cited examples) to be split with the other person. There is no negotiation--Person A makes a take-it-or-leave-it offer to Person B. The catch is that if Person B refuses, each person receives zero.
The classical economics they teach you in high school and college would predict that even if Person A offered Person B one cent and kept the remaining $9.99 for her/himself, Person B would still have a penny more than s/he had before, and would therefore accept. Because something is always better than nothing, riiiight???
As it turns out, capital-R Reality doesn't exist to fulfill the premises of classical economic thought any more than it does, say, story problems in math. A low-ball offer meant that Person B had very little to lose by refusing. And, maybe it's just because the Puritans gained such an early toe-hold in the American psyche, but the impulse to punish high-handed greed is fairly strong, too. In practice, 50-50, 60-40, and even 70-30 splits had a fairly high likelihood of being accepted. But once a threshold of "unfairness" was crossed...not so much.
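For the code-inclined, the whole dynamic fits in a few lines. Here's a minimal simulation sketch in Python; the 30% "insult threshold" is my own illustrative number, not a figure from any particular study:

STAKE = 10.00  # the pot Person A gets to split

def classical_responder(offer):
    # Homo economicus: anything beats nothing.
    return offer > 0

def human_responder(offer, fairness_threshold=0.30):
    # Real subjects: reject offers that feel insulting.
    # (The 30% cutoff is illustrative only.)
    return offer >= STAKE * fairness_threshold

def play(offer, responder):
    # Returns (Person A's payout, Person B's payout).
    if responder(offer):
        return STAKE - offer, offer
    return 0.0, 0.0  # refusal: both walk away empty-handed

print(play(0.01, classical_responder))  # (9.99, 0.01) -- "something beats nothing"
print(play(0.01, human_responder))      # (0.0, 0.0)   -- greed gets punished
print(play(4.00, human_responder))      # (6.0, 4.0)   -- a 60-40 split clears the bar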
The metaphor to the current state of the U.S. economy (and political state) seems all-too-obvious...
- As banks sit on hundreds of billions of dollars of bailout-backed credit
- As corporations hoard even more than that in profits, waiting for someone else to create the jobs...and demand for their products
- As pay is not so much a fraction as it is a logarithmic base of productivity
- As the cost of a college degree rises in tandem with offshoring and union-busting
- As pernicious unemployment and foreclosure rates undermine consumer confidence...and spending
- As we expect an entrepreneurial "creative class" to spontaneously emerge from generations taught to standardized tests
- As gerrymandering and astro-turfing polarize the electoral landscape
- As the concentration of wealth into a shrinking pool of bank accounts further tilts the political and legal table
If, en masse, the American worker/consumer walks away from the deal, it might actually be good on some levels: decreasing personal debt, a mom-n-pop entrepreneurial boom, and maybe--just maybe--an increased focus on quality of life. But apart from that...boom. The pity is that those who play the role of Person A in this real-world "game" will not walk away with nothing. At best, they'll be less rich. Once again demonstrating how carefully-controlled lab results can mutate freakishly in the wild.
Friday, September 23, 2011
Frivolous Friday, 09.23.2011: Close enough for government work
Slashdot today ran a piece about the U.S. Government paying its own programmers half the going rate for contract programmers. The comments, at least early on when I read them, tended to focus on the premise that contract programmers are paid extra to, well, go away on short notice. (I've worked as a "temp"--high-tech flunkie as well as office minion--and, frankly, I have no idea where that notion comes from, at least not when a temp agency is involved.)
Me, I'd tend to place the discrepancy at the intersection of hiring freezes and the spend-it-or-have-your-budget-slashed-next-year school of fiscal "management" that I've seen in the private sector as well.
But speculation, however plausibly grounded in past experience, is not the point. Combine nerdy quirkiness with stupefying levels of through-the-looking-glass bureaucratic "logic," and the reasons could well fall beyond the pale of our workaday norms. The most likely of those, to my way of thinking, include:
1.) Well, duh: People from the outside are always smarter
2.) Legendary public sector "job security" includes cubicle in lead-lined bunker and cryogenic suspension in the event of thermonuclear Armageddon
3.) Pay comparison doesn't take into account standard government-issue solid gold laptops
4.) Coders willing to take lower pay to develop "secret government technology" cachet irresistible to fellow geeks of the preferred gender
5.) Pay differences easily offset by illegal kickbacks from soda and energy drink vendors
6.) Former college interns didn't notice the "indentured servitude" clause in their NDAs
7.) Government I/T departments are the digital tar-pits where old COBOL and VB6 programmers go to die
8.) Uncle Sam's coders are rented out as cheap off-planet labor for our secret extraterrestrial allies--and neural implants don't grow on trees, you know
9.) Daily flogging and haranguing by Grover Norquist & Tea Party to destroy self-worth
10.) Once-in-a-lifetime chance to hack Andrews Air Force Base and take Air Force One out for a joyride
Tuesday, September 20, 2011
"Training the Trainer," revisited
Those Who Know Best asked me to train our Client Services folks on "my" application. Cross-pollination, to be sure--just more in the sense of folks in lab coats and latex gloves brushing pollen off one carefully selected plant and onto another, equally carefully selected one.
But those were the extent of the specifications, leaving me to fill in the details. Which, naturally, involved bribes of food, wine, and chocolate, and randomly sorting the competing teams into their Hogwarts houses. That was to make up for the pre-class quiz, which they were really good sports about. The first half assembled in the big conference room yesterday for the actual hands-on session.
No worries...I was ready with easily two hours of material to cover, during which Hufflepuff, Ravenclaw, Slytherin and Gryffindor would take turns at the console doing actual client-type stuff on a test system. In my experience, that lends itself to questions far more than having features demonstrated to you.
What actually happened was that we quickly realized that the way they support the clients on their application is not at all how I support mine. Most notably, when there's a problem, I generally stick my head straight into the database itself. Client Services, on the other hand, relies on the interface. Partly that's because those tools have been built up for them over the years, and partly because some have neither the software nor a knowledge of SQL (Structured Query Language), much less an idea of how the data fits together.
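(To make the contrast concrete, here's roughly what "sticking my head into the database" looks like--a Python/SQLite sketch against a completely invented schema; our real tables look nothing like this:)

import sqlite3

# A local working copy, never the production database.
conn = sqlite3.connect("support_copy.db")

# The sort of diagnostic query I'd run when a client reports
# "my order vanished": hunt for rows stuck in a half-finished state.
# Table and column names here are made up for illustration.
rows = conn.execute(
    """
    SELECT id, client_id, status, updated_at
    FROM orders
    WHERE status = 'pending'
      AND updated_at < datetime('now', '-1 day')
    ORDER BY updated_at
    """
).fetchall()

for row in rows:
    print(row)
conn.close()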
Some, particularly the most senior folks, do have those skills, and I had made the shaky assumption that they were acquired by the usual on-the-job organizational osmosis. Wrong assumption, obviously. Which, for anyone presenting, just might trigger a freak-out, because the agenda has suddenly evaporated. Which normally means pulling the plug on the whole thing or completely free-wheeling. Both are valuable meeting skills. But then the questions started flying thick and fast as, for lack of a fresher phrase, two worlds collided.
And you know what? It was straight awesome. The balance of the two hours zipped by as I was grilled and in turn tried to get into their heads. Sure, occasionally we'd dip into the software to illustrate something. But for the most part it was meta-information: What the overall client relationships are like, some of the frustrations of working in a distributed development environment (instead of the one-stop-geek that is me), what the process is like on the client side. Those kinds of things.
I'd do it all over again...and I may just have that chance when I work with the second crew a week from tomorrow. I can only look forward to the instructive chaos that will bring.
Tuesday, September 13, 2011
Attention Inflation Disorder
We had an interesting bit of "training" over the lunch hour today. One of the deep-thinkers wired himself into our large conference room via two-way webcam and unloaded a couple decades of experience on us, which included the pendulum-swings between centralized and distributed computing fads, and the dead-wrong predictions/assumptions committed by even the most forward-thinking of the technorati.
For me, the money-quote was the prediction of an "information economy." Our guest re-cast that instead as an "attention economy," on the premise that information is only valuable if someone reads/views/hears (and, I would add, acts upon) it. Our colleague also theorized about our obsession with glowing rectangles (phones, tablets), and the apparent necessity of maxing out our attention bandwidth when it's not satisfied with the work and people and general doings around us.
Those two notions (attention economy and voluntary information saturation) kind of meshed into the notion that, in terms of classical economics, we're voluntarily debasing our own currency. (Most especially when those brain-CPUs are in paparazzi or "Farmville" spaces.) I suppose it wouldn't be a big deal if Moore's Law and the general premises of computing applied to the think-meat between our ears. Presumably then we could evolve to a state where our internal process monitors looked something like:
30% - Curing Disease
30% - Ending Poverty & Injustice
30% - Saving the Planet
0.0001% - How long are those eggs in the 'fridge okay after their expiration date?
9.9999% - OOOOH--SPARKLY BALL OF TIN FOIL!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Me, I'm not holding my breath. But in the absence of unprecedented rates of brain evolution (or neural augmentation), our techniques for managing divided attention will have to evolve to take up the slack. And, sadly, I'm not holding out any more hope for that, either. Not after seeing how stubbornly mainstream corporate culture invests in tired carrot-and-stick paradigms, years of disconnect between worker productivity and pay notwithstanding. Sigh.
But such cognitive fragmentation is something we need to start acknowledging in our work lives--and devising coping strategies for its corrosiveness. Particularly in my profession, where one is expected to toggle between blinders-on, deep-dive focus and collaborative brain-pooling in such an immediate and binary fashion. Anything less is living in denial. And, in the long run, the cost of living in that zip code is higher than anywhere else on earth.
Sunday, September 11, 2011
Silly Sunday, 09.11.2011: A reorg. poem
The rumor mill is humming
As it's running at full bore:
Word from the grapevine says that
A reorg. is in store.
And soon The Powers That Be
Are citing "re-alignment,"
Which for us can only mean
One thing: Reassignment.
So boxes now we scavenge,
Then we pack up all our stuff
And cough and sneeze amidst the haze
Of dust and lint and fluff.
Windex and compressed air
At each cube make a stop:
Coffee-rings fade, keyboards harvest
Crumb-farm bumper-crops!
Our PCs we then power down
And unsnarl spaghetti-wires;
Desktop Support is too swamped
With fighting bigger fires.
Traffic crowds the elevators
The hallways and the stairs
As we ferry stacks of boxes
And monitors on chairs.
We read our mail by smartphone
And quell the urge to thwack
That fool in our new office who
Has not begun to pack.
Greetings, my new cube-mate:
No doubt we'll get on fine--
So long as you keep to your half
Like I will keep to mine.
New roles and hats are donned
Tho' the going starts out slow.
And as the dust yet settles,
One thing I claim to know:
In my annual review
Surely this I did not mean
When I said that I could use
A little change of scene.
Friday, September 9, 2011
Post post-poned
Home from work now, headache in tow. Tomorrow is another day, still busy, but hopefully more in the pro-active sense.
Cheers, all!
Tuesday, September 6, 2011
(Yet another) Sign of the times
I tried to log in to Twitter earlier today, only to be greeted by the trademark Fail Whale and the uncharacteristic (of late, anyway) message that the web's foremost ADOS application was "over capacity."
But rather than immediately roll my eyes over scalability growing-pains, my first thought was to wonder where the earthquake/tsunami/hurricane/tornado/revolution had hit. (Or, more cynically, which overrated celebrity had died.)
Of course, nothing of the kind had happened (at least not anywhere off the "Hic Sunt Dracones" area of the American mind-map). But I thought the fact that my first instinct was to assume that the disaster was outside Twitter's server-room--and the fact that I didn't question this until some time later--was interesting. If I'm not alone, then I think Twitter should be congratulated on a serious milestone. (Good job, y'all.)
Friday, September 2, 2011
No post tonight
I'm afraid I haven't been feeling too frivolous the last few days, and tonight is anything but the start of a languorous three-day holiday weekend.
Hope everyone is back home by Tuesday, safe, sane and not too sunburned.
Cheers, all!
Doreen