From a development standpoint, cloud computing is a flexible and elastic computing environment. Think of it as a server with unbounded resources, capable of running any number of programs for any number of users, with effectively infinite processing power, memory, storage, and bandwidth.
Cloud computing is a dream for people like me who are developers at heart but have spent much of their time as involuntary administrators, be it of Web servers, database servers, file servers, firewalls, networks, or whatever.
Of course, everything good must come with something bad, and cloud computing is no exception. There is the lack of complete control over the resources, the risk of unforeseen outages, and the risk of data leaks and data loss. Being at the mercy of the cloud vendors may not be good either: they could shut you down by mistake, or for some bogus reason.
But one aspect I hadn't given much thought to is cost, especially the unexpected kind. Last weekend at work, an unforeseen condition awoke a piece of bad code in our data access layer, causing CPU thrashing on our database server. The systems still worked, albeit near a denial-of-service condition. By Monday we had identified and patched the code, and everything was back to normal. No cost was incurred as a result, other than perhaps some extra heat generated by the server.
If our systems had been hosted in the cloud, we might have faced a large invoice from the provider for the processing and bandwidth used. Or the provider might simply have shut down our services to protect its other clients. The results could have been catastrophic.
When you operate your own servers, you may be able to get away with some mistakes. But in the cloud, mistakes could cost you a lot, even your job. That risk is something we should be keenly aware of as the computing landscape slowly shifts to the cloud. Like it or not, cloud computing will eventually supplant custom-hosted systems. It'll come with many benefits, but we must recognize that it won't be as easy as dumping in the applications and calling it a day.