In the midst of the constant change that is the IT industry, the idea of a stable, long-term cycle between centralization and decentralization is appealing. It might be comforting if something in our industry were as stable as the tides! Sure, we're all worked up about cloud computing now, but soon we'll be refocusing on what we can do locally with our next-gen smartphones, or perhaps our cybernetic implants. It's never a long hop from comforting myth to received wisdom or dogma, and by now many people view the cycle as a law of nature.
I think the received wisdom is wrong, and the apparent pendulum is about to stop swinging.
Sure, sometimes what looks like a pendulum is in fact a pendulum. But at least as often, if you're swinging back and forth between two alternatives, it's because neither one of them alone is meeting all your needs. When you finally find an alternative that does meet all your needs, you can stop swinging. In this case, technology is finally reaching the point where we can have our cake and eat it too. We want fast, powerful, rich applications that we control completely, but we want to do absolutely zero work to administer our machines and applications.
For applications that interact with you immediately and richly, nothing will ever beat a computer that's right near you, perhaps in your hand. But the average user has no patience or tolerance for maintenance activities -- storage management, backup, and system administration in general. Such tasks are far better accomplished by remote servers, professionally administered. Traditionally, application designers have had to trade off the speed and responsiveness of a local application against the reliable remote maintenance of a server-based application.
That tradeoff will soon be as obsolete as EBCDIC. Today, we have ultraportable machines that provide rich interfaces and media. We have high-speed Internet connectivity almost anywhere you might want to go (with a few lamentable exceptions such as my own home, but that's a story for another day). And, to allow users (even corporate ones) to be largely oblivious to the problems and complexities of running reliable services, we have the emerging paradigm of cloud computing.
It's true -- mostly -- that no individual technology in cloud computing is fundamentally new. Networked services have been around since the 1970s, and most modern applications were first demonstrated in the 1980s as part of university projects like Andrew (Carnegie Mellon) and Athena (MIT). But as a participant in the Andrew project, I can tell you that much of what we did was amazingly cool on campus, but completely impractical for the wider environment. Trust me, your iPhone wouldn't seem nearly so cool if it had to do all its communication at 14.4 kbps. (Your Android wouldn't either, but at least you could distract yourself hacking the system code to try to speed it up.)
What's new -- what makes cloud computing far more than the latest buzzword -- is simply that all of the pieces have finally come together in mature form: powerful computing devices so cheap you can view them as nearly disposable; high-speed connectivity so ubiquitous you can view it as nearly everywhere (if you don't visit me); and maturing cloud-based services so reliable that you can view them as nearly always available. When you put all of these together, you can begin to look at all your data processing needs in a radically different light.
With a properly configured setup, a cloud-oriented user should never have to worry about backing up anything, ever again. He'll never need another disk or flash drive, because his data will always be available to him anywhere he goes. Similarly, a business that has moved its office functions to the cloud should never have to worry about system administration beyond the most localized activities, such as adding and deleting user accounts, or keeping the local network running. (Of course I'm not talking here about companies that write or operate complex systems; I'm talking about businesses using services that are not part of their own mission or expertise.)
Did you drop your Droid into the swimming pool? Did your salesman leave his laptop somewhere in an airport? In the cloud world, these become no more than minor nuisances -- you just replace the machine and reconnect to the cloud. Your data is intact, and you once again have at your fingertips more processing power than all the computers in the world had back when astronauts walked on the moon. (This is literally true today for Droid users, though not for most laptop users; cloud computing can make it so for the vast majority of business users as well.)
I love the new paradigm -- that's why I recently jumped ship from a cushy position at IBM to work for Mimecast, a cloud startup -- but it certainly has its downsides. Jobs for sysadmins in non-computer-oriented companies will dwindle, though this will be offset by jobs at cloud companies, and by jobs administering cloud services and taking ever-better business advantage of their capabilities. And the issues of privacy and security will, in any conceivable architecture, always demand constant vigilance from providers and thoughtful attention from users, administrators, and managers -- pretty much everyone.
Still, I think that the promised land is in sight. Most of us will find the cloud world simpler, more efficient, and more pleasant to use than what has come before. Once you've been in the cloud for a while, you'll never want to come back -- unless you've stumbled onto a bad service provider. That's probably the biggest danger in the next few years, as the cloud market sorts itself out. We can see the promised land from here, but if you want to get there safely, a trusted guide is still an awfully good idea.