Friday, January 28, 2011

Frivolous Friday, 01.28.2011: History looms

I've been on a costuming kick lately, and Dennis gave me a rather extravagant present in the form of an inkle loom. My original intent had been to try tablet-weaving, until I realized that inkle weaving would involve fewer moving parts. This--in my case, at least--is a certified Very Good Thing. (Mind you that, over the years, I've managed to learn to embroider, hand-sew, crochet, tat, use a lucet and even fumble my way through netting. But in all those cases, I only have to think about one needle, shuttle or what-have-you. Ambidextrous I am not.)

Sad to report, my first attempt at warping it--meaning stretching the long threads around the pegs--didn't exactly go swimmingly. Guess who missed a key point in the (illustrated) instructions? Why, yes: That would be the former tech. writer. (Got it in one, Gentle Reader, got it in one...)

I made some jest about it on Facebook to the effect that my infinitely-more-competent-with-textiles foremothers would be disgraced. But my old Forensics Team "mentor" (and fellow geek) added a few comments involving Jacquard Looms (arguably the world's first "programmable" machines) and the Luddite rebellion (which involved looms), noting that King Ludd himself would be smiling upon me for my manual methods.

Booyah for FT Mentor, finding the nerdy dimension! I could just hug him.

A few weeks on--otherwise known as "last night"--I gave it another shot. This time I managed not to screw up the shed. But I did biff the pattern-planning, forgetting that there must always be an odd number of warp threads--which means that patterns are typically calculated from a center point (the middle thread). Thus, the second attempt is basically lop-sided by one thread. Then, too, it took me something short of a foot of weave before I started pulling the weft in tight enough to form the intended pattern.

Which is when it really hit me that, hey, this has math! Granted, not as much math as card-weaving has, but math nevertheless. Which, in this case, is not so much a matter of "knowability"--in other words, responding to fixed rules deduced from First Principles--as of feedback: "I know this is correct when I see it like this." See, when you block the pattern out on graph paper, you're accounting for all the threads, and that's grossly misleading because, at any given time, about half will be forced to the "top" of the weave and the other half or so will be pushed down out of sight.

Which, in essence, means I could write a simple little web application that would allow the user to specify colors for all threads and then programmatically generate the pattern. With a little modification, it would work for tablet-weaving as well. Personally, I like to think that Lady Ada* would smile upon such efforts.
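Just to sketch the charting core (not the web front end), here's a rough cut in Python. The weave model is my own simplifying assumption--each pick brings roughly every other warp thread to the surface--and the function name and colors are made up purely for illustration; a real inkle draft gets fancier the moment you start picking up threads by hand.

    # A minimal sketch, assuming a plain warp-faced weave in which alternating
    # picks bring alternating halves of the warp to the surface. Names and
    # colors are illustrative only.
    def band_pattern(warp_colors, picks):
        """Chart the colors visible on each pick of a plain inkle band."""
        if len(warp_colors) % 2 == 0:
            raise ValueError("Inkle patterns want an odd number of warp threads.")
        rows = []
        for pick in range(picks):
            offset = pick % 2  # every other pick shifts which threads show
            rows.append([color for i, color in enumerate(warp_colors) if i % 2 == offset])
        return rows

    # Example: a small symmetrical warp (odd count, centered on "blue"), charted for six picks.
    warp = ["red", "white", "red", "blue", "red", "white", "red"]
    for row in band_pattern(warp, picks=6):
        print(" ".join(f"{color:5}" for color in row))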

- - - - -

* "The distinctive characteristic of the Analytical Engine, and that which has rendered it possible to endow mechanism with such extensive faculties as bid fair to make this engine the executive right-hand of abstract algebra, is the introduction into it of the principle which Jacquard devised for regulating, by means of punched cards, the most complicated patterns in the fabrication of brocaded stuffs. It is in this that the distinction between the two engines lies. Nothing of the sort exists in the Difference Engine. We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves."

- from Ada Lovelace's notes, appended to her translation of Luigi Menabrea's article on Charles Babbage's Analytical Engine, circa 1843

Tuesday, January 25, 2011

A tale of Merlot and meta-data

I stumbled across a mention of data-scraping in wine-media maven @RickBakas's Twitter stream tonight. Being who & what I am, the intersection of two of my passions meant I had to dig further.

As a disclaimer, though, let's just say that my "contributions" to wine's social media dimension have been slim. I normally watch Wine Library TV via podcasts (and thus qualify as a "lurker"), and I had a very dusty account on the probably soon-to-be-defunct Corked.com wine review site. The vast majority of the wine I consume is homebrew, so I'm even more of a misfit in a world that probably has Coke-vs.-Pepsi-style squabbles between devotees of Sancerre and fans of Pouilly-Fumé*.

Long story short: I don't have a horse in this race.

So. The backstory is that there's a website called CellarTracker. From what I can see from a superficial perusal, it allows those who have too many wines to remember to track the wine in their cellars--presumably before it becomes vinegar from age. Additionally--and this is the juicy part--it's basically a database of wine-tasting notes, plus other amenities you'd expect from a social website.

Then there's this other website called Snooth. It's also about wine. It's admittedly prettier, though much stalkier ("Hello La Crosse wine lover."), and does a reasonable job of impersonating a wine magazine. As part of its aggregation backend, however, Snooth trolls the web looking for wine-related content, presumably ranks its relevance according to some algorithm, and then moshes & categorizes (tags) it into something its users can actually search.

There's certainly a value-add in the filtering process, assuming that the false positives and false negatives are absolutely and ruthlessly minimized. That's been Google's bread-and-butter for a decade now, and (remembering what search was in the 90s) I'll be the last to complain about that business model. Snooth might have other ethical issues to contend with, but blatantly "stealing" data isn't one of them.

Yesterday, a windstorm-in-a-wineglass arose with charges that Snooth had been scraping tags (i.e. the descriptive categories) from CellarTracker, in violation of an agreement not to do so after the companies ended their partnership in 2007. I'm sure my gentle reader will be shocked to learn that oenophiles can talk trash as well as anyone else, although perhaps with somewhat better sentence construction--probably comes from having to learn all that French, I suspect. ;-)

Today's development was an apology from Snooth's Philip James, along with an explanation of how the oversight occurred. It's quite well done--to the point where it would serve as a fair example for college business and marketing majors to absorb. To her credit, @JancisRobinson (Master of Wine, travel maven and all-around class act) tweeted the apology to her 55K+ followers. Hopefully that'll tone down any lingering rhetoric.

But in the aftermath, my gut says that this is just a taste--a sip, if you will--of what's to come. Notice in this case that the bone of contention wasn't the wine reviews themselves, it was the meta-data--i.e. the thing that makes the web more than just a collection of single words. As more and more folks cross-post, recommend, rank, re-tweet, comment, follow, friend, fan, etc., the meaning will be found in the meta-data--the underlying relevance & whatever meanings can be teased from those connections and groupings. And that's even without the extra dimensions of geolocation.

Having content worth linking to, favoriting or whatever still matters--no question. But the money's in the meta.

- - - - -

* The Sancerre and Pouilly-Fumé wine-making regions are basically across the river from each other in France's Loire valley, and both wines are made from the Sauvignon Blanc grape. But in wine as in religion, the narrowest differences make for the widest schisms. Or at least it sure seems that way...

Friday, January 21, 2011

Frivolous Friday, 01.21.2011: Monkey-chow

As "highs" go, there seem to be two that most geeks to live for. Mind you, I'm basing that "most" on a sample based on my fellow programmers and I/T folks. The winner, by far, is complete immersion in the parts that fascinate them--be it exercising mastery of the domain or pushing the buttons of a shiny new thing to see what makes it work. The second is the "high" that comes that wrestling with a gremlin and finally prevailing.

No, what I'm talking about tonight really falls into the category of B-list "highs" for geeks. It's intended mostly for those whose work or domestic tranquility depends in some part on recognizing these symptoms. If you already grok the immersion and de-gremlin bits above, congratulations: You're already ahead of the curve.

So, in no particular order, such "symptoms" include:
  • Feedback main-lining If you see a geek type an arcane incantation into a black screen and then scrutinize the oodles and oodles of text scrolling up the screen and off the radar--with complete grasp of its details, never mind how fast it disappears--you're witnessing an itty-bitty little geek "high." The computer has been water-boarded into divulging everything it knows. And, particularly for geeks who came of age in the 80s and early 90s, there is nothing so primally satisfying.
  • The "Master of the Universe" dashboard Just this morning, our formidible SysAdmin--and, no, you still won't hire him away from us if you know what's good for you--showed me a third-party tool that allows him to access/manage all the servers at once. Which, admittedly, was pretty darned cool. Mind you, at the time, I really only needed him to rename a couple folders, rescue a couple of accidentally deleted files, and bump up my permissions to a database. But I guess he had to show somebody, and--apparently--a recovering SysAdmin wannabe was good enough.
  • Gratuitous hardware Sure, most software doesn't insist on octo-core processors, triple-digit gigabytes of RAM, much less hard drive storage capacity that would bug out the eyes of any futuristic science fiction show ever made. It's not what you actually do with it--before the next upgrade, anyway--it's what you can do. If a freak lightning strike takes out NASA's control center minutes before the Space Shuttle is due to land, that intrepid (and hot!) intern can jack into your system and together you can save the day--and that's all that matters when the credits roll, baby!
  • Things that Light Up Personally, I think this is something of a revenant--for all I know insinuated into our little geeky brains via Saturday morning cartoons. But, really, how can you not pay attention to the LED on your USB drive when it's backing up data? Or, if the light's not the flashy kind--say, the one on your laptop's power-brick--how can you look away after unplugging it, or resist the urge to cradle it against you, crooning, "Hang in there, help is coming," as the little light fades and dies? (If not, what kind of anthropomorphizer are you, I want to know?!?!)
  • The Need for Speed Mind you, I don't consider it any coincidence that our office Alpha-Geek used to race cars and just happens to be the one who seems to delight most in performance improvements. Nuh-uh. So what if the application was up on blocks while the clients were climbing the walls? Can't they see how much faster it runs now? (Don't mind the bondo and primer--that'll buff out...)
  • Connoisseurship of Cross-referential Arcana By which I mean "inside jokes" found outside their natural habitat. By which I further mean collating and publishing all known instances. Prime example, courtesy of @afstanton: Joss Whedon's "Han Solo in carbonite" knick-knack having a recurring role in Firefly. Or the sly little Lord of the Rings nuances in Babylon 5 (or its short-lived successor Crusade). (Or, for that matter, the crudely rendered X-Files schtick in the afore-mentioned Crusade.) Because who wants to be the hayseed who's getting by on memorizing all the Lexx scripts? Cosmopolitan geeks FTW!
This doesn't presume to be a thorough list--not even in a sense that would mollify most statisticians. But if you've seen any of the above behavior, chances are good that you're living and/or working in proximity to a bona-fide geek: Ave et cave.

Tuesday, January 18, 2011

Heretical thoughts on process

When you say you're focusing on process, what I (almost always) hear is: "We can't debug our communication problem(s)."

Although, in some organizational dialects, the translation is closer to "We need more rules to cover all the exceptions we have to make."

Or, in Middle Managementese, it can mean, "I need to show my boss that I understand what it is you people do around here (and that I'm in control of it)."

At the "straw boss" level, the understanding might be something akin to, "I don't have time to do my job and referee spitting-matches over turf besides."

On the lowest rungs, process can too easily become a security blanket: "Don't make me make decisions that I could be capriciously punished for!"

Now, I don't have a problem with process when it's virtually indistinguishable from communication. And in the term "communication," I also mean the organizational semaphore that means, "Hey, my part of this job is ready for you--go get'em tiger!"

But when process becomes anything more than an artifact of the most sustainable, stress-free way of getting the product out the door? Problem. Why? Because processes themselves too easily become the bulwark against having to look change straight in the eyeballs. Which makes about as much sense as Prohibition in the middle of The Roaring Twenties, I'll grant you. But an all-too-large percentage of the population has a knack for mistaking laws for mores.

Substitute "process" for laws and "corporate culture" for mores, and you'll have a sense of why tying process to product is such poor management. People come and go, which by necessity puts the culture (and its organic communications flow) in constant flux. Moreover, the ultimate end of process is product, and in my trade, the product is in flux as well. Which is the crux of the problem, particularly when process is merely another outlet for politics.

The moral of the story is to rely on process as sparingly as absolutely possible. And never, ever mistake it for management--much less real leadership.

Friday, January 14, 2011

Frivolous Friday, 01.14.2011: Song of the shovel

There's a music that's made
When a white velvet falls
To lay down fresh carpets
For the Winter Queen's halls

Sadly, its melody
Woos no listener's ear,
For its tunesmiths intend
That none other should hear

Its rasps nor its scrapings
Nor its snuf'lings of nose,
As the mercury drops
And a western wind blows.

Yet on plays this minstrel
Under veiled winter moon
(Though she with two handles
Could not carry a tune),

For she thinks on the Spring,
Of lilacs and sparrows--
As the banks pile higher
And blacktop drive narrows.

Then Sisyphean toils
Have some ending in sight
(Though the plow-rolls rebuild
Themselves during the night).

So the minstrel meanwhile
With her shovel plays on,
Paying the Winter Queen
Homage of labor's song.

Tuesday, January 11, 2011

NIH, revisited

No, not "NIH" as in "National Institute of Health," actually. What I'm talking about is the "Not Invented Here" syndrome, first (for me, anyway) outlined in a Joel On Software blog post. (No relation to the web comic of the same name. At least, not as far as I know...)

The twist on the idea that occurred to me today was: It's not even a question of trusting the borrowed/purchased code when your marching orders consist of:

1.) Copy.
2.) Paste.
3.) Edit.
4.) Profit!

Rather, it's a question of what I call "borrowed expectations." Because Step #4--in my experience, anyway--is entirely predicated on the assumption, "It shouldn't take you that long." Or at least, "This should be a no-brainer." As a programmer, either of those phrases should set off the equivalent of Yellow Alert in your brain. I hesitate to put any hard numbers to it, but, after the last week, my gut's version of Algebra suggests taking your default "sandbag factor" and applying it every time you hear one of the above phrases (or a variation thereof).

In other words, if you usually sandbag by 25%--i.e. have a sandbag factor of 1.25--then you take your original time estimate and multiply it by 1.25 every single time someone informs you that "...all you need to do is..." or "...So-and-so's already done 99% of the work for you..." or "...all you and QA need to worry about is..."
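To put toy numbers on it (the function name and the eight-hour base estimate below are mine, strictly for illustration):

    # A throwaway sketch of the compounding rule above: multiply the original
    # estimate by your sandbag factor once for every "it shouldn't take you
    # that long" (or variation thereof) you hear.
    def sandbagged_estimate(base_hours, sandbag_factor, red_flags_heard):
        return base_hours * sandbag_factor ** red_flags_heard

    print(sandbagged_estimate(8, 1.25, 1))  # 10.0   -- one "no-brainer"
    print(sandbagged_estimate(8, 1.25, 3))  # 15.625 -- three of them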

All such key-phrases? Horse-hockey.

All of them.

Why? Because of a simple logical inference that should be--but mostly isn't--burned into any programmer's synapses from the get-go: Legacy Code => Constraints. In purely Algebraic terms, the correlation between the two is linear--if you're lucky. To wit: Remember how they taught you in Health Class that when you sleep with a person, you're sleeping with all their former partners and the partners of those partners all the way back to Adam and Eve? Yeah. Well, include-files (and the implicit--and often unfathomable--design decisions they implement) are basically All That And a Bag of Chips(TM @garyvee), too. Except that, with code, abstinence isn't an option: 'Nuff said.

Now, I'm not saying that there isn't a time and a place for code re-use. Actually, make that "times" and "places." I'm merely decrying the eternal optimism that such reuse embodies--even in people whose experience should be telling them--at mega-decibel volumes--that the statistical likelihood of it being "just that simple" is on par with lightning strikes, shark attacks, penny-stock billionaires and the like. Thus, arm yourself accordingly. With sandbagging, documentation (of every--cough--"unforeseen" complication), the appropriate "I told you so"s and so forth.

Look. I adore the office Alpha-Geek, and would trust him (and the majority of my co-workers) with my cat, houseplants, etc. But there are limits, and the afore-mentioned borrowed expectations of already-written-and-tested code number among them. There's just something about that that confounds even the left-most of left-brainers. And it's best that aspiring programmers (as well as those who work with them) recognize that. And react accordingly. And, whenever possible, don't wait to merely react; bake it into Standard Operating Procedure.

Friday, January 7, 2011

Frivolous Friday, 01.07.2011: Disrupting technology

Not "disruptive technology," mind you. I would have titled this "The Seven Habits of Highly Effective PITAs," 'cept I'm sure that sort of thing's been done to death. But here are seven nearly fail-safe ways of making sure technology doesn't happen according to plan. On behalf of my fellow programmers, pretty-please-with-fair-trade-organic-chocolate-sauce-on-top don't use any of them. Ever. Thank you.

1.) Be the Baby Duck. "Imprint" on the programmer who wrote the code. Never mind that absolutely no one else is experiencing the problem--even with your login. Never mind that you have far more qualified desktop/network support personnel at your disposal. Obviously, the problem is in the code, not your computer--and don't let that lazy code-monkey tell you otherwise!

2.) Bug the Debugger. In other words, when someone's devoting their whole attention to troubleshooting your problem, carpet-bomb their inbox/IM/voice-mail with fresh "findings," suggestions, or--failing those--drama. Bonus points if you work in the same building and can stand over their shoulder.

3.) Freak out when the program does precisely what you specified...yesterday. C'mon...you can't possibly be expected to remember what you told the programmer to make it do a whole day ago, for pity's sake! (See also #2, above.)

4.) Use technology to maximize your potential. For passive-aggressive behavior, that is. BCC the programmer's boss on all "bugs," particularly in scenarios 1-3, above. Use the bug tracking software to punish programmers who don't take your drama seriously enough, and circumvent it for those who do. When sending meeting invitations, trust the popularity barometer you honed in middle school.

5.) Save the biggest decisions for last. Obviously, that's not always realistic. Sometimes decisions accidentally make themselves. In those cases, whatever else you do, do not share them until it's absolutely unavoidable. Otherwise, how could you possibly expect to inject any sense of urgency into the schedule?

6.) Always remember that "Consistency is the hob-goblin of little minds." It's true. Some Famous Dead White Guy(tm) whose stuff you had to read in college totally said that. Which gives you carte blanche for changing your mind in the middle of filling out the spreadsheet that'll be imported into the database. 'Cuz all data is created equal, right?

7.) You only have time for real-time. To wit: The only time for paying attention to new features, enhancements, fixes, etc., is after they've been deployed into the cloud, burned to a gazillion CDs, blessed by the App Store, yadayadayada. Not in the proof-of-concept stage. Not in QA. Not in Beta. Oh, nononononononono... After all, you were changing the specifications right up until D-Day, so why would you ever--in your right mind, even--waste your time reviewing anything before then?

- - - - -

Credits: Big ups to Dennis for #7. (Now scampering off to hang my head in shame for not thinking of that one right off the bat.)

Tuesday, January 4, 2011

Rough edges and revolution

Silicon Prairie News posted footage from Clay Shirky's "Institutions vs. Collaboration" TED talk. I thought it was a little weird that a 2005 talk would be considered 2011 news, but I watched it. Mainly because Cognitive Surplus was easily the best book I read in 2010--no contest.

A highly worthwhile 20 minutes, no question. But the takeaway for me was this line: "If it's really a revolution, it doesn't take us from Point A to Point B. It takes us from Point A to chaos." Which is precisely what makes revolutions inherently terrifying to all but the preternaturally nimble-footed and natural-born adrenaline junkies. That's why historians love them--the revolutions themselves as well as their navigators (and cheekiest stowaways). All from a safe distance, naturally.

The problem, though, is we're used to analyzing revolutions in CliffsNotes format. Connections are oversimplified; correlation too closely resembles causation. Even in the best history. In the worst, historicism itself is replaced by the cult of personality. As the internet revolution creeps toward the end of its second decade, you see the "biography" of the revolution unfolding through its cast of characters:

* Marc Andreessen taking credit for Netscape
* Bill Gates issuing the "Pearl Harbor Day" memo that amounted to assisted suicide for Netscape
* Jeff Bezos and Tony Hsieh re-inventing retailing
* Brin & Page founding an unlikely search empire on the premise "Don't be Evil"
* Steve Jobs' (now-predictable) "...one more thing..."
* Ashton Kutcher racing CNN to a million Twitter followers
* The Social Network: 'nuff said.

And, naturally, the graveyard is full of sock-puppets, buzzwords, dancing hamsters, Dow 50,000 predictions, much sticky residue from the IPO effervescence, Rick-rolled videos, Friendster, Dean-screams and so much more. That's the story we'll tell our kids, assuming we can unplug them from whatever gadgets connect them to their tribes and alternate lives. The revolution will have a tidy narrative.

In political revolutions, backing the winning side (multiple times if necessary) is the key to survival. Just ask Monsieur Talleyrand. In revolutions powered by economics (e.g. the industrial revolution) or technology (e.g. the printing press, to use Shirky's example), backing the wrong platform doesn't seem to be quite the mistake that mistaking its context is. One solution never fits an entire family of problems. The strictest law of any change is the Law of Unintended Consequences. That sort of thing.

I'm not knocking knowing, or even mastering, a given technology or platform--far from it. The more (and longer) you have to learn, the easier it should become to unlearn and move on. Understanding context even as it shifts is (as we say in programming) "platform independent," but not bothering to understand the platform(s) du jour is too much like letting the mob sweep you into its madness. Mobs typically have a single goal, usually myopic, and rather often sidetracked in the end. Those who incite them don't always come to fairytale endings, largely a result of the same short-sightedness.

But historical metaphors aside, the main thing to fear in any technological revolution is the tendency toward incrementalism and optimization--doing more of the same with fewer resources. There's a time and place for that, but not when you can see people struggling to file the rough, freaky angles off new-ish forms and pound them into old pigeonhole concepts. Witness Google's spats with China and the Wall Street Journal, Matthew Drudge and/or Julian Assange vs. mainstream media, Comcast vs. Netflix, Tivo vs. advertisers, etc.

Freaky edges, after all, are the instructive parts, the DNA mutations that may or may not live long enough to make a new species, the street-corner prophets. And we are trained from kindergarten--perhaps even before--to shut our ears and eyes to them. But unless you have a failsafe plan to smuggle yourself and the family jewels to a safe country, that's no way to survive any revolution.