INNOVATION TALK: For keen students of computer history, the name Vannevar Bush holds a special place in the pantheon of visionaries. Back in the early 1930s, the American inventor conceived of the Memex machine, a forerunner of the modern-day computer. It consisted of a combination of microfilm rolls, screens, keyboards and cameras that would augment human memory. Bush was way ahead of his time, and his vision remained nothing more than a far-fetched idea for decades.
What is less often recalled is that Bush envisioned the Memex with a pair of touchscreens – it would be a tactile complement to the mind. Those engineers at Stanford and Xerox Parc who brought Bush’s dream closer to reality in the 1960s and 1970s invented the ingenious graphical user interface with an onscreen cursor controlled by a mouse – designed to compensate for the fact that we couldn’t manipulate onscreen objects with our fingers. It was a paradigm that went essentially unchallenged until the iPhone arrived in 2007, followed by the iPad in 2010, finally bringing Bush’s tactile vision into the mainstream.
For future students of computer history, the dawn of the tablet will go down as pivotal – it’s not often that you get a veritable Petri dish in which to observe new computing conventions establish themselves. The past few weeks might be seen as a tipping point, with varying approaches set to battle it out in the marketplace – Apple, Microsoft, Google and Amazon are all offering new or updated tablets.
But the innovation is about far more than just being able to use our fingers as a pointing device. Along with the graphical user interface came a whole host of abstractions that we’re now well used to – applications running in “windows” opening so-called “files” found in organisational “folders” arranged in hierarchies that can be accessed through file-management systems. And that’s before you even get to the menus.
For most people, this is what “computing” is all about, but all these conventions are normative rather than inherent. There are any number of ways of doing things – after all, typing into a command line interface used to be what “computing” was all about, too.
In his last keynote speech as Apple chief executive, at the WWDC conference in June last year, a frail Steve Jobs introduced Apple’s iCloud service and outlined his vision for computing. “A lot of us have been working for 10 years to get rid of the file system, so the user didn’t have to worry about it,” he explained. “When you try to teach somebody how to use the Mac . . . everything’s going along fine until you hit the file system and then the difficulty is staggering for most people.”
For Jobs, the post-PC era wasn’t just about introducing mobile touchscreen devices, it was about moving away from many of the conventions that defined the PC: the file system, the desktop, windows and so on. The iPad gave him the opportunity to do just that, and the result is a computing environment that has removed so much complexity that it can be comfortably mastered by an infant.
For all of us familiar with the old conventions, that simplicity introduces constraints – while using an iPad is delightful, it feels like we have less control than we’re used to. And it is exactly that feeling that Microsoft hopes to exploit with its latest OS, Windows 8. This offers a hybrid model: a stylish touch-friendly environment existing in parallel with a traditional computing environment, all those file systems and windows still intact. Microsoft maintains that this is a no-compromise solution, but I can’t help thinking it’s actually a massive hedge. It springs from an unwillingness to admit, or an inability to see, that those computing conventions we have become so accustomed to are not the only conventions that can be useful.
Just as some people happily use the command line interface in certain contexts, many of us will continue to use the conventional PC for certain tasks. But a lot of the time, computing is about browsing the web and sending messages to friends – tasks the vastly simpler post-PC tablet devices excel at. Microsoft is betting that most people have been conditioned to think that all those windows and menus and file-management hassles equate with “computing”, and will thus resist the new paradigm of simplicity – maybe they’re right, but I’m sceptical. Change is necessary; rethinking traditions is healthy. We are watching the development of new conventions that will determine how we use computers for the next few decades. And after that? Well, that’s when this century’s Vannevar Bush will stake his or her place in the history books.