Wired on Friday: There's a dance that's been going on in software ever since the first terminal was connected to the first mainframe computer: it's the client-server tango, and its movements centre on the question of who, exactly, should be taking the lead.
In a pairing of two machines over a network, should the powerful remote machine do the lion's share of processing, with your local computer taking the easy role of displaying the results? Or should the remote machine fade into the background, handling only a limited set of features, like sending simple pages, or remotely saving files, leaving your powerful home computer to do all the donkey work?
For decades, the trend has been for the hard work to be done by the remote computer on the far end of the net. That was certainly the case when college students would gather around a teleprinter to see what decisions the mighty IBM mainframe had made. Then, in the 1980s, the burden shifted to the home: it was the machine in front of you on the desk that ran the core of the software.
Since then, there's been a tug of war with no clear winner. One prominent company would declare the birth of the simple home "network appliance", enslaved to an all-powerful server, even as other companies dismissed the need for any remote computer at all.
That's how it was at the beginning of the web boom. Prepackaged software sellers Microsoft were blindly ignoring the market for powerful internet server software, just as Oracle was ambitiously announcing the death of the overpowered home computer. In the future, Oracle declared, we'd all have the simplest, cheapest diskless $100 (€82) boxes at home, with all our documents stored and edited on powerful Oracle megaservers humming away elsewhere in some collective data warehouse. Word processors would run on the remote machine, and your PC would do nothing but menially relay each letter you typed on its keyboard to a master machine on the other side of the net.
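To make that vision concrete, here's a rough sketch in TypeScript of what such a keystroke-relaying thin client might look like. Everything in it - the endpoint URL, the response shape, the "screen" element - is invented for illustration; it is not Oracle's actual design, merely the general idea.

```typescript
// A minimal sketch of the thin-client vision, under assumed names:
// every keystroke is forwarded to the server, which owns the document
// and sends back the rendered result for the local box to display.

// Hypothetical endpoint on the "megaserver" that holds the document.
const SERVER = "https://example.com/document/keystroke";

async function relayKeystroke(key: string): Promise<void> {
  // The local machine does no editing logic of its own; it just
  // forwards the key and displays whatever the server sends back.
  const response = await fetch(SERVER, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ key }),
  });
  const { renderedDocument } = (await response.json()) as { renderedDocument: string };
  document.getElementById("screen")!.textContent = renderedDocument;
}

// Wire every key press on the page straight to the remote machine.
document.addEventListener("keydown", (event) => relayKeystroke(event.key));
```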
In the end, we landed somewhere in between. Our home computers are powerful, and shoulder a surprising amount of the workload involved in handling the net. For instance, the program you use to browse a web page runs to around two million lines of code; the remote program on the superpowerful server that dispatches that page to you is only one-tenth of that size.
Then again, no one imagines that if those large remote servers disappeared tomorrow, our home computers would have anything like the power and functionality they do now.
Yet now we're once again in an awkward transitional phase, where nobody seems to be very confident about what should run where.
Take the latest trend in fashionable websites: reimplementations of existing desktop software that run in your web browser and are controlled from a remote server. Visit a site like http://www.writely.com/ or http://www.ajaxwrite.com/ and you'll see a web page that seems the spitting image of a blank Microsoft Word document. Sure enough, in both cases, it's a word processing application running not on Windows, but on the web.
With Writely, you can type into that document on any machine, from anywhere. Documents are saved on the remote server, and many of Word's clever functions, including spell checking and change tracking, run on this far-off machine.
You can even work on documents collaboratively with other members of the site (the site is currently closed to new registrations, following Writely's purchase by Google). AjaxWrite is a little more home-loving - you save documents locally, and most of the hard work is done by the browser, not the remote server. If anything, it's like fetching the Word application from a faraway server and running it on your local machine. The bonus is that you don't need to worry about buying upgrades or installing security fixes. Nonetheless, both are still far more dependent on a third-party computer than the self-contained desktop applications you buy from Microsoft.
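The difference between the two approaches is easiest to see with a sketch. The TypeScript below contrasts a Writely-style remote spell check with an AjaxWrite-style local one; the endpoint URL, response format and dictionary are hypothetical stand-ins, not either site's actual code.

```typescript
// Writely-style: the browser is a thin shell; the heavy lifting
// (here, spell checking) happens on the remote server.
async function spellCheckRemotely(documentText: string): Promise<string[]> {
  const response = await fetch("https://example.com/spellcheck", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text: documentText }),
  });
  const result = (await response.json()) as { misspellings: string[] };
  return result.misspellings; // the far-off machine decides what's misspelled
}

// AjaxWrite-style: the browser downloads the logic once and runs it
// locally; the server is little more than a place the code came from.
function spellCheckLocally(documentText: string, dictionary: Set<string>): string[] {
  return documentText
    .split(/\W+/)
    .filter((word) => word.length > 0 && !dictionary.has(word.toLowerCase()));
}
```

Either way the user sees the same red squiggles; what moves is where the work happens, and with it, what breaks when the network goes away.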
At the other extreme lies another web start-up, Webaroo. Webaroo takes all the content on the web and then throws away the remote servers. It sells a desktop application that can search and display prepackaged "web packs" - archived collections of thousands of websites that you download in one go before switching off your connection.
You don't have to be connected to the internet to browse these subsets of the web's content, because they're stored locally - which means you can access a limited web experience, even when you're not online. It's the internet unplugged - in theory.
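Again, a small sketch shows the shape of the idea. This is not Webaroo's actual web-pack format, just an assumed in-memory stand-in: pages archived in advance, searched entirely on the local machine.

```typescript
// A minimal sketch of the "web pack" idea: pages are stored locally
// ahead of time, so searching them needs no network connection at all.
interface Page {
  url: string;
  title: string;
  body: string;
}

// In a real web pack this would be thousands of pages loaded from disk.
const webPack: Page[] = [
  { url: "http://example.com/", title: "Example", body: "A sample page about dancing." },
];

// Search runs entirely against the local archive; switch off the
// connection and it still works, but it only ever sees this frozen
// snapshot of the web.
function searchOffline(query: string): Page[] {
  const q = query.toLowerCase();
  return webPack.filter(
    (page) => page.title.toLowerCase().includes(q) || page.body.toLowerCase().includes(q)
  );
}

console.log(searchOffline("dancing").map((page) => page.url));
```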
There's something a little gimmicky about both these extremes. You can't help but be impressed with the technological prowess of the coders behind the two in-browser word processors. It's truly amazing that you can split the duties of a large application so smoothly between a remote and a local machine.
Nonetheless, it's clear that desktop applications still have the lead in speed, responsiveness and features. Webaroo, too, is cleverness in search of an application. It's increasingly unlikely that avid web surfers will be offline in these days of 3G and Wi-Fi, and for the general net user with her random search requests, a small frozen fraction of the web isn't good enough.
Still, even if all-remote or all-local services aren't the waves of the future, they're sure to find their niche. And perhaps henceforth, rather than debating where our applications should run, we'll spend more time neither knowing nor caring.
After all, it's not who is taking the lead in the client-server relationship that matters, it's how well they dance together.
Danny O'Brien is activism co-ordinator, Electronic Frontier Foundation