Add the cost of installing the software suite, then updating and patching it. Also consider the unexpected costs of data destruction from MS Excel or Word macro viruses – costs a cloud user never has to deal with.
As an example from the 1990s, there was software called CheckFree for electronic bill payment. You, the user, had to physically install this software from floppy disks, and you also had to have a dial-up modem. From time to time, you updated the software with a new version. Consumers who were technically savvy enough to endure this hassle got the benefit of not writing paper checks, plus automatic computer records of their financial activities. Nowadays, you don’t need this complicated setup because most banks offer “online bill payment” to any customer with a web browser – no complicated software installs! This is a form of “cloud computing.”
The way many cloud-based apps address this is to offer an “offline” or “local cache” mode. When connectivity resumes, documents & data are synchronized.
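As a minimal sketch of how that can work – assuming a hypothetical saveToServer() call and Edit shape, not any particular product’s API – edits go into a local queue, and the queue is flushed to the server whenever the app is online or comes back online:

```typescript
// Minimal sketch of a "local cache + sync" mode. The Edit shape and
// saveToServer() endpoint are hypothetical, just to show the idea.

type Edit = { docId: string; content: string; timestamp: number };

class OfflineCache {
  private pending: Edit[] = []; // edits not yet confirmed by the server
  private online = true;
  private flushing = false;

  // Record an edit locally; try to sync right away if we're online.
  applyEdit(edit: Edit): void {
    this.pending.push(edit);
    if (this.online) void this.flush();
  }

  // Called when the network comes back (e.g. the browser 'online' event).
  onReconnect(): void {
    this.online = true;
    void this.flush();
  }

  onDisconnect(): void {
    this.online = false;
  }

  // Push queued edits to the server in order; stop (and retry later)
  // if a request fails, so no edit is lost.
  private async flush(): Promise<void> {
    if (this.flushing) return; // only one sync loop at a time
    this.flushing = true;
    try {
      while (this.pending.length > 0) {
        const edit = this.pending[0];
        try {
          await saveToServer(edit); // hypothetical network call
          this.pending.shift();     // discard only once the server has it
        } catch {
          this.online = false;      // assume we dropped offline; retry later
          return;
        }
      }
    } finally {
      this.flushing = false;
    }
  }
}

// Hypothetical server call -- stands in for whatever API a real app uses.
async function saveToServer(edit: Edit): Promise<void> {
  await fetch(`https://example.com/docs/${edit.docId}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(edit),
  });
}
```

The key design choice is that an edit leaves the local queue only after the server confirms it, so a connection dropping mid-sync never loses work.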
Probably what we’ll end up with for many scenarios is “hybrid” computing: a centralized “cloud” paired with a local “offline” mode, maximizing the strengths of each.
The initial hardware outlay might be cheap, but the ongoing maintenance (and software license) costs are not. For many companies, the maintenance costs run 3x to 10x the cost of the actual computer – a $1,000 desktop could mean $3,000 to $10,000 in support, patching, and licenses over its life.
Actually, “cloud computing” has already taken hold in many forms, but it doesn’t look that way because the technology evolved without a fancy buzzword (“cloud”) attached to it. In the mid-1990s (before the web was mainstream), one could buy an encyclopedia on CD (such as Microsoft’s Encarta) for $50. Now you can access encyclopedia articles from Wikipedia. That’s a form of cloud computing. Microsoft also had another CD package with info about movies and TV shows; that’s been taken over by imdb.com. Several vendors also sold CD-ROM packages of digitized maps, or complete USA phone and ZIP code listings. Now you can get all of that from mapquest.com or maps.google.com.
Now imagine the size of computer you would need to process all the terabytes of Wikipedia or the petabytes behind maps.google.com. No computer or hard drive you can order from Dell or Apple has the capacity or the horsepower to do it.
It will be OK for most folks. Most people don’t work on huge 1,000-page legal documents, and they don’t manipulate spreadsheets with a million rows and a thousand columns. A typical use case is writing a 2-page letter for a job application, or a child writing a 10-page book report for school. A lot of consumers use computers in such a limited way that even the modest CPU of an iPhone could handle it, and internet connectivity can easily handle these scenarios.
Certainly, there are some user scenarios where accessing remote data is not optimal. If you’re a graphic designer editing huge photo or video files that run to 100 gigabytes, the performance would be unacceptable. An engineer working on huge architectural drawings (airplanes, skyscrapers) can’t work like this either.
It’s like electricity. For 99% of us, it’s good enough to get our electricity from a centralized power utility station (“cloud electricity”). However, there’s still that 1% that needs a specialized, dedicated power plant (a critical military complex, or a factory that burns its own coal).
(On the other hand, some of us are going “green” and installing solar panels on our roofs, decoupling us from the power grid. But many of us don’t have the roof square footage, or the land acreage for windmills, so we have to supplement our onsite electrical generation with the centralized power grid. This is similar to the hybrid situation with software.)