Tuesday, July 14, 2009

Moore's Law in the age of mobile computing

[For the benefit of non-technical readers: Moore's Law predicts that the capacity of computer chips--the parts that do the "thinking" and short-term "remembering"--doubles approximately every two years.]
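To put a rough number on that doubling for the curious: it's just exponential growth, which a few lines of purely illustrative Python can capture. The starting count and time span below are made up for the sake of example, not taken from any particular chip.

    # Idealized Moore's Law: chip capacity doubles every two years.
    def project_capacity(initial, years, doubling_period=2.0):
        # Capacity after `years` = initial * 2^(years / doubling_period)
        return initial * 2 ** (years / doubling_period)

    # Made-up example: 1,000,000 transistors today becomes roughly
    # 32,000,000 after ten years (five doublings).
    print(project_capacity(1000000, 10))  # 32000000.0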

I was thinking about that today, and wondering whether CPU speed will, in the main, be as critical in the coming decade as it has been in the last thirty years. Certainly, servers will continue to need every resource that can be thrown at them, particularly given the proliferation of virtual servers. Gamers probably won't complain of too-fast processors anytime soon, either. And you can never have too much memory, of course...

But as computing steps away from the office, replacing desktop with notebook, notebook with netbook, and netbook with phone/PDA/entertainment chimeras, low power consumption may count for more than pure computing brawn.

Which leads straight to the question of batteries--how long a charge lasts, how many times they can be recharged, how long it takes to refill them, and that sort of thing. After all, if we're relocating our "office" to the coffee shop or beach or what-have-you, why not make a day of it? The most bleeding-edge CPU probably won't make a skinny latte's worth of difference.

Playing into the interval between recharges is, of course, solid-state memory breaking out of its original role in sneakernetting and cheap personal data backup. Solid-state drives (SSDs) don't carry the power-sucking overhead of moving parts, though the cheaper flash of the moment can still lag a traditional hard drive on some operations, like sustained writes. That said, its lag in total storage capacity will, IMO, be more of a concern than processing speed. Because, of course, our iPod Touch should have as much space as our classic iPod, yes?

Additionally, extra attention needs to be paid to the screens. Time was when a large CRT monitor could bogart nearly as much electricity as the 'fridge. That's changed quite a bit, but screen power draw is still another potential roadblock to mobile computing. Two cases in point: I like the brightness of the laptop I'm currently using, but it comes at the price of a two-hour battery life. Similarly, the larger inner screen of my cellphone (think big, pretty Christmas card postage stamp vs. the "Forever" ones you use to snail-mail the bills) will often drop a bar on the battery when I pop the phone open to use the QWERTY keyboard. Anecdotal, to be sure, but I think most laptop owners would concur that slimming down displays' power usage would likewise make them feel less tethered to an outlet.

I guess the bottom line is that the less time and attention we have to spend watching battery levels, the more natural mobile computing will become. Not to detract from AMD, Intel, etc., but pure number-crunching horsepower isn't central to that experience anymore. Of course, on the backend, it may well--and probably will--be. Naturally, we want those gadgets to connect to the internet and work with something like the responsiveness of a full-blown desktop. Why? Because that's what computers have trained us to expect.

Understand that I'm certainly not trivializing Moore's Law. For a trend to have held up for forty-four years and counting is quite staggering, really, given how widely distributed and complex (and, sometimes, fickle) computer manufacturing has become since the 1960s. But my gut feeling is that its relevance is waning (in comparison to other considerations), even as the explosion of gadgetry has us taking our inner and interpersonal lives with us wherever we go.