I think we shouldn't overrate cloud computing.
@unknownuser said:
Cloud computing is a model for on-demand network access to a shared pool of configurable computing resources (data storage, processing power).
By that definition, even the SETI@home project already counts as cloud computing, although it uses private computers spread around the world instead of data centers.
A VNC connection to a computing center running Windows Server is also already cloud computing: the apps run entirely on the server and only the screen output is transferred to a thin client.
Most people, however, primarily think of web applications when they hear "cloud computing" and expect better performance. I think that impression only exists because web apps are "simple" and have fewer features. Web apps use platform-independent, uncompiled JavaScript, which generally performs far worse than compiled code (which only runs on a specific hardware platform and operating system).
As for where the application actually runs, I think it depends: Google Docs and GMail keep working even if you remove the network connection. In spreadsheets, you still have the full feature set for calculating, formatting, sorting, and resizing cells. If you watch your CPU load, you'll see that even such simple tasks in pure JavaScript are a bit more demanding than they would be in a normal desktop application. Resource-intensive tasks (file format conversions, video processing, rendering, etc.) and search requests are handled by the server.
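To make that division of labour concrete, here is a minimal TypeScript sketch of the pattern: lightweight operations run locally in the browser, heavy ones are delegated to the server when a connection exists. The `/api/export` endpoint, the document ID parameter, and the `Cell` shape are made-up placeholders for illustration, not any real Google Docs API.

```typescript
// Hypothetical sketch of the client/server split described above.
interface Cell { value: number; }

// Lightweight work: summing a column can run entirely in the browser,
// even offline, because it only needs data already loaded in the page.
function sumColumn(cells: Cell[]): number {
  return cells.reduce((total, cell) => total + cell.value, 0);
}

// Heavyweight work: a file-format conversion is delegated to the server
// when a connection is available; otherwise the caller is told to wait.
async function exportAsPdf(documentId: string): Promise<Blob | null> {
  if (!navigator.onLine) {
    console.warn("Offline: export becomes available once the connection returns.");
    return null;
  }
  // Placeholder endpoint, not a real API.
  const response = await fetch(`/api/export?doc=${documentId}&format=pdf`, {
    method: "POST",
  });
  return response.blob();
}
```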
On the other hand, if everything were done server-side, people would ask to use such apps even when no network connection is available.
What I personally like is the platform independence, which lets you use any operating system you prefer, including ones that run equally well on very old and very new hardware. It's quite bizarre that we are moving towards slow, uncompiled code while browsers keep improving performance (modern browsers internally compile and optimize the JavaScript for the specific platform).