When the buzzword “cloud computing” (and its siblings such as “in the cloud”) first came onto the scene, I was under the impression that it was about handing off intensive computation to a large array of computers and having them gang up on the project, instead of subjecting your local computer to the load. Kind of like connecting to a mainframe back in the old days.
You know, like taking the average temperature each minute from each of 10,000 reporting stations for a year’s worth of data, computing each station’s difference from the overall average, computing the variance, mapping the hot and cool spots, and constructing a motion algorithm to explain how the hot and cool spots move over time.
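For concreteness, that kind of job splits naturally into per-station chunks that independent machines could chew on. Here is a minimal sketch of the per-station piece, using Python's multiprocessing pool as a stand-in for a cluster and randomly generated readings in place of real station data (the station count and reading count are scaled way down; the names and numbers here are all hypothetical):

```python
import random
from multiprocessing import Pool

def station_stats(readings):
    # Per-station work: mean and variance of one station's readings.
    # Each station's chunk is independent, so it can run anywhere.
    n = len(readings)
    mean = sum(readings) / n
    variance = sum((x - mean) ** 2 for x in readings) / n
    return mean, variance

if __name__ == "__main__":
    # Fake data: 8 stations x 1,000 readings. The real scenario would be
    # 10,000 stations x 525,600 minutes in a year.
    random.seed(0)
    stations = [[random.gauss(15, 5) for _ in range(1000)] for _ in range(8)]
    with Pool() as pool:
        # Fan the per-station work out to worker processes, gather results.
        results = pool.map(station_stats, stations)
    for i, (mean, variance) in enumerate(results):
        print(f"station {i}: mean={mean:.2f} variance={variance:.2f}")
```

In a genuine "gang up on it" setup, `pool.map` would be replaced by something that ships the chunks to other machines, but the shape of the work is the same: independent pieces fanned out, results gathered back.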
But as actually used, the term seems to cover little more than online storage of documents and, occasionally, the use of hosted versions of some common applications (Google Docs in lieu of Word and Excel, for instance) for creating and maintaining files instead of your own local programs.
Is that an accurate impression? Is there a lot of “cloud computing” going on that is other than “store your files here instead of on your own hard drive” + Google Docs?