Sunday, April 18, 2010

Hollywood programming languages

A co-worker and I were on the subject of movies, and I asked whether he'd seen the impressive Mongol. He hadn't, although the light of recognition immediately appeared in his face: "It's about Genghis Khan, right?" "Yep," I affirmed. "With John Wayne?" he asked, and--as I spluttered in horrified disbelief--proceeded to Google it to show that he wasn't making that up. Sure 'nuff: The Duke swaggers his way across the steppes to win the love of Susan Hayward (Princess Bortei).

I'm sorry to report that I kinda flipped out. Minus the "kinda" part. Such are the downsides of having a History degree...

Apparently, there are similar hazards in having a programming degree...even a two-year one. The Vo-Tech--as it was called in Red Wing, MN--heavily emphasized the C and C++ languages in its curriculum (circa the late dot-com era, anyway). Since then, I haven't had much use for either C or C++...until now, when I find myself having to scramble up the learning curve of Objective-C, used on Macs & iProducts. And it honest-to-Pete is like someone asked Hollywood to write the language, but was too cheap to retain Kernighan, Ritchie and Stroustrup as consultants. Apart from all the stuff I disliked in C/C++ (pointers, memory management), there's been precious little that gives its creators any right to name the language after its alleged ancestor.
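For the curious non-Mac crowd, a minimal sketch of what I mean. The Foundation calls below are the genuine article; the program itself is just a toy I cooked up to show the contrast:

    #import <Foundation/Foundation.h>
    #include <stdio.h>

    int main(void) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        /* Plain C, the way the Vo-Tech taught it: */
        printf("total: %d\n", 2 + 2);

        /* The same idea in Objective-C: square-bracket message sends,
           @-prefixed string literals, and classes everywhere. Squint
           all you like; K&R it ain't. */
        NSString *greeting = [NSString stringWithFormat:@"total: %d", 2 + 2];
        NSLog(@"%@", greeting);

        [pool drain];
        return 0;
    }

Yes, the C half still compiles--Objective-C is a strict superset in that narrow sense--but everything you actually do on a Mac lives in the bracketed half.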

And, frankly, that sucks. Just like making a "historical" epic that could pass for history about as well as Donna Reed could pass for Sacajawea. Yes, there's a legitimate need for "telescoping" in drama--the Henry V prologue* And All That.

But.

To create a language that claims to be a child or sibling to another and then make it nearly unrecognizable to a student/master of the original language? That's Hollywood-style revisionism, and the harm it does is not limited to offending cranky middle-aged programmers such as your faithful blogger.

Family history--biological or linguistic--carries certain "baggage." In the case of biology/psychology, it has to do with health and neural wiring. In the case of language, it has to do with syntax and conventions. Java and PHP each bear a strong resemblance to C/C++--ironically, more so than Objective-C does. But their designers had the decency to name them something else. There is, literally, no "c" in "Java" or "PHP." Which makes them easier to learn, in the sense that the C-savvy aren't so tempted to drag in all the assumptions of the language they know.

See, whenever you learn a new language (human or computer), there are actually two processes at work: forming new assumptions about the underlying patterns, and unlearning old ones. Insisting that Objective-C is really just a variant of (or successor to) C seriously trips up the "unlearning" part of learning. To my way of thinking, at least, that's a problem, and one for which the language's designers should be taken to task.
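A concrete for-instance (mine, not anybody's official curriculum): in C, you learn that calling through a NULL pointer is a crash, full stop. Objective-C quietly rewires that reflex--sending a message to nil is perfectly legal and just answers zero:

    #import <Foundation/Foundation.h>

    int main(void) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        NSString *missing = nil;

        /* C-trained instincts say this should blow up like a NULL
           dereference. It doesn't: messaging nil is legal in
           Objective-C and simply returns 0 -- here, a length of 0. */
        NSUInteger len = [missing length];
        NSLog(@"length of the missing string: %lu", (unsigned long)len);

        [pool drain];
        return 0;
    }

Until you unlearn the old rule, code like that reads like a bug waiting to happen; after you unlearn it, half your defensive null-checks turn out to be dead weight.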

- - -

*
... For 'tis your thoughts that now must deck our kings,
Carry them here and there; jumping o'er times,
Turning the accomplishment of many years
Into an hour-glass: For the which supply,
Admit me Chorus to this history;
Who, prologue-like, your humble patience pray,
Gently to hear, kindly to judge, our play.