I'm not quite understanding the practical value of "cloud" (i.e. internet-based) computing

After seeing many references to how the future of computing is moving to "cloud" computing (i.e. internet-based computing) I wikied it up, and all it means (essentially) is that instead of programs and data being resident on your hard drive, they will be pulled from and saved to the net. So now the "hard drive" is at the other end of the network instead of inside your machine.

This is supposed to be a big deal for consumers re cost saving, in that you can charge someone by the minute for using a net-based word processor or spreadsheet or other application vs spending a few hundred for an MS Office suite. I'm really not seeing the huge benefit in this. How much is the cost of a suite spread over 2-4 years of use? It's minuscule.

The main negative I can see is that all this is based on net connectivity so if the net is down you are SOL getting any work done. What is performance going to be like if you have a pokey net connection? The overall cost savings and benefits just don’t seem all that compelling.

I asked this exact question a few weeks back.

Long story short? The benefits are dubious at best, IMO.

I think there’s more to it than just moving the location of apps and storage. For example, in a cloud model, as a software vendor, you divvy your app into functional parts and sell each one to users. So if I have 100 employees who need a word processor, but only 10 who need a particular, specialized function, I only have to buy that function for those 10.

It also makes upgrades and patches much easier to implement. Instead of making changes to 100 computers, I now don’t have to make ANY changes to any computer - ever.

It also makes collaboration easier. For example, I use google docs. If I want to get input from someone on a document, spreadsheet or whatever, all I have to do is make it public (I think there are ways of limiting access so it’s not truly “public”).

Then there is platform independence - the same reason that Java became so popular. If I can get my Commodore 64 on the net, I can read and write the same file types as anyone else.

I'm sure there are other reasons but that's all that comes to mind at the moment.

Thanks, that thread addresses most of my questions. It is odd how people pushing cloud computing keep talking about not needing "powerful" machines with lots of storage space when both those precise things are dirt cheap these days. The automatic backup and plug-in/plug-out functionality for businesses is the most compelling (to me) thing. Plus I'm guessing there would be no viruses, and your system would never be slowed down by a bloated registry or similar woes common to stand-alone machines, assuming you're loading a fresh net-based OS each time. I'm still wondering how real-world performance is going to work out if all your data is at the other end of a network connection.

I work on a system that has widely variable batch processing requirements. It’s economical, in this specific case, to spin up a few dozen EC2 instances overnight when they’re cheap, process a bunch of crap in parallel, and then upload the results to S3 for later retrieval. This method is cheaper than buying the maximum amount of required hardware and having it sit idle in a data-center most of the time.
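The economics the poster describes can be sketched with a back-of-envelope model. All the rates and quantities below are made-up placeholders, not real AWS prices; the point is just the shape of the comparison between renting overnight burst capacity and owning peak hardware that sits idle most of the day:

```python
# Rough cost sketch: renting burst capacity vs. owning peak hardware.
# Every rate below is an illustrative placeholder, not a real AWS price.

def rental_cost(instances, hours_per_night, nights, rate_per_hour):
    """Total cost of spinning up short-lived instances each night."""
    return instances * hours_per_night * nights * rate_per_hour

def ownership_cost(servers, price_per_server, monthly_upkeep, months):
    """Up-front hardware plus power/cooling/admin over the same period."""
    return servers * (price_per_server + monthly_upkeep * months)

# 30 instances for 4 hours a night, every night for a year:
rent = rental_cost(instances=30, hours_per_night=4, nights=365,
                   rate_per_hour=0.10)

# Owning 30 servers sized for the nightly peak, idle the other 20 hours:
own = ownership_cost(servers=30, price_per_server=2000,
                     monthly_upkeep=50, months=12)

print(f"rent for a year: ${rent:,.0f}")   # 30*4*365*0.10  = $4,380
print(f"own for a year:  ${own:,.0f}")    # 30*(2000+600)  = $78,000
```

With numbers like these the gap is an order of magnitude, which is exactly why the pattern only pays off when the load is bursty; if the servers were busy all day, ownership would win.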

But it’s certainly not the right solution for every application. Different tools for different things.

The big one for me is having access to my data anywhere, anytime (provided the internet is working). I do a lot of work from home, from the office, from a friend's place, wherever, and having access to whatever I need is very valuable. That includes programs, too - having access to Google Docs (even if I think it's clunky next to Excel) without having to worry about which version I have is worth a lot.

Mind you, I also do a lot of work that would be terrible to do on a cloud - large files, lots of reads/writes, that sort of thing. Honestly, like any other new tool, I think people got way too excited and thought it was the solution to everything, when it’s just one tool out of many. Fantastic for some things, glad it’s around, but it’s not going to completely take over the world the way some people predicted.

Today we also have what is known as the 'private cloud'. In a traditional corporate datacentre you know exactly what task is running on each physical machine. With a private cloud the company still keeps all the hardware in its own datacentre, but everything is virtualized in a cloud. So for example, during product development more resources can be allocated to the engineering department. Then during the big financial number crunching that comes at year end, you shift resources from engineering to finance. It's all about flexibility and cost savings on hardware and power.

Re “number crunching” is this really something that company wide IT assets have to be re-prioritized for? I was under the impression that CPU horsepower is so potent and cheap these days that a single beefy, well outfitted server could handle just about any mid sized company’s enterprise level EOY crunching without having to rely on distributed resources.

That’s not necessarily true. Companies use computers for things other than generating their quarterly financials.

Much is being made these days of "virtualization". Instead of having 5 physical servers, each running separate applications or databases and running at an average 15% capacity, you can have one or two physical servers running at 75-90% capacity, with each application residing on a "virtual" machine.

CPU horsepower is cheap and powerful, but it still costs money to build, run and maintain servers. And it adds up for a company of 5000+ people serving hundreds of thousands or millions of customers. The concept is to reduce costs by cutting the amount of hardware and maximizing its usage.
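The consolidation arithmetic behind virtualization is simple enough to sketch. The utilization figures below are illustrative, echoing the ones in the post:

```python
# Sketch of the virtualization math: how many consolidated hosts are
# needed to absorb several lightly loaded physical servers?
# Utilization figures are illustrative, matching the post above.
import math

def hosts_needed(server_utilizations, target_utilization):
    """Sum the real work being done (in 'whole servers' of load) and
    divide by how hard we're willing to run each consolidated host."""
    total_work = sum(server_utilizations)
    return math.ceil(total_work / target_utilization)

# Five physical servers each idling along at 15% capacity...
before = [0.15] * 5
# ...fit on one host run at 75% capacity, with headroom to spare.
print(hosts_needed(before, target_utilization=0.75))  # -> 1
```

Five boxes' worth of power, cooling and maintenance collapse into one, which is the whole pitch.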

I've wondered: what are the bandwidth costs of cloud computing?

There have been a lot of media reports claiming the internet is rapidly approaching saturation. The demands seem never satisfied.

Up till now the office worker creating a spreadsheet didn't even need an internet connection, although nearly every PC is networked these days.

Just wait until somebody creates a Rainmaker Virus.

Then, cloud computing will lose its corporate advocates…fast!!!

Add the cost of installing the software suite, updating it, patching it. Also consider not having to deal with unexpected costs of data destruction from MS Excel or Word macro viruses.

As an example from the 1990s, there was software called CheckFree to do electronic bill payment. You, the user, had to actually physically install this software from floppy disks. You also had to have a dial-up modem. From time to time, you updated this software with a new version. The consumers who were technically savvy enough to endure this hassle got the benefit of not writing paper checks, plus automatic computer records of their financial activities. Nowadays, you don't need this complicated setup because most banks offer "online bill payment" to any customer who has a web browser – no complicated software installs! This is a form of "cloud computing."

The way many cloud-based apps address this is to offer an “offline” or “local cache” mode. When connectivity resumes, documents & data are synchronized.

Probably what we'll end up with for many scenarios is "hybrid" computing: a centralized "cloud" plus a local "offline" mode, maximizing the strengths of each.
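One minimal way to picture that local-cache-plus-sync pattern is a toy sketch (not any real product's design): edits always land in a local cache, and edits made offline queue up and drain to the cloud when connectivity returns.

```python
# Toy sketch of an "offline mode" document store: edits always land in a
# local cache; when the network comes back, queued edits sync upstream.
# Illustrative pattern only, not any real product's implementation.

class HybridDocStore:
    def __init__(self):
        self.local = {}      # always-available local cache
        self.cloud = {}      # stands in for the remote service
        self.pending = []    # doc ids edited while offline
        self.online = True

    def save(self, doc_id, text):
        self.local[doc_id] = text
        if self.online:
            self.cloud[doc_id] = text
        else:
            self.pending.append(doc_id)   # remember to sync later

    def reconnect(self):
        """Connectivity resumed: push every offline edit to the cloud."""
        self.online = True
        for doc_id in self.pending:
            self.cloud[doc_id] = self.local[doc_id]
        self.pending.clear()

store = HybridDocStore()
store.save("letter", "Dear Sir,")
store.online = False                    # the net goes down...
store.save("letter", "Dear Sir, v2")    # ...but work continues locally
store.reconnect()
print(store.cloud["letter"])            # -> Dear Sir, v2
```

Real sync engines also have to handle conflicts (two machines editing the same document offline), which is where most of the actual difficulty lives.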

The initial hardware outlay might be cheap but the ongoing maintenance (and software license) costs are not. For many companies, the maintenance costs could be 3x to 10x the cost of the actual computer.

Actually, "cloud computing" has already taken hold in many forms, but it doesn't look that way because the technology evolved without having a fancy buzzword ("cloud") attached to it. In the mid-1990s (before internet access was widespread), one could buy an encyclopedia on CD (such as Microsoft's Encarta) for $50. Now you can access encyclopedia articles from Wikipedia. That's a form of cloud computing. Microsoft also had another CD package with info about movies and TV shows. That's been taken over by imdb.com. Several vendors also had CD-ROM packages of digitized maps, or complete USA phone & zipcode listings. Now you can get all that from mapquest.com or maps.google.com.

Now imagine the size of computer you would need to process all the terabytes of Wikipedia or petabytes of maps.google.com. You can't order a computer or hard drive from Dell or Apple that's big enough, or has the horsepower, to do it.

It will be OK for most folks. Most people don't work on huge 1000-page legal documents. They don't manipulate spreadsheets with 1 million rows and 1 thousand columns. A typical use case would be writing a 2-page cover letter for a job, or a child writing a 10-page book report for school. A lot of consumers use computers in such a limited way that it could be handled by the modest CPU of an iPhone. Internet network connectivity can easily handle these scenarios.

Certainly, there are some user scenarios where accessing remote data is not optimal. If you’re a graphics designer editing huge 100 gigabyte photo or video files, the performance would be unacceptable. An engineer working on huge architectural drawings (airplanes, skyscrapers) can’t work like this either.

It's like electricity. For 99% of us, it's good enough to get our electrical needs from a centralized power utility station ("cloud electricity"). However, there's still that 1% that need a specialized dedicated power plant (a critical military complex, or a factory that burns its own coal).

(On the other hand, some of us are going “green” and installing solar panels on our roofs to enable us to decouple from the power grid. But many us don’t have the roof square footage or land acreage for windmills so we have to supplement our onsite electrical generation with the centralized power grid. This is similar to the hybrid situation with software.)

My main experience is as a non-corporate user working with family located across the country on a mutual project. Using "the cloud" we were each able to view progress and make our contributions to the effort in real time.

In the corporate setting my experience is perhaps shy of the true cloud experience: it is a corporate server that we access to document patient medical records, whether from our office exam room thin clients, when the patient shows up at our after-hours clinic at another location, or from home PCs, or laptops … or iPhones in the near future. From my end-user POV this is the same as a cloud, even though my corporation maintains the servers, and it has the same user advantages. The data is more secure stored centrally rather than on my home machine or laptop: I can use it anywhere, but only with the proper passwords, both to get into the Citrix environment and into the program itself. On call this is huge, not only for refreshing my memory of what we covered at the last visit, but for seeing what my partner saw when she saw the family, and what happened when they went to the specialist in another office who is still part of our same larger group. The central storage can also do a better job with data backup and built-in redundancies than I could, or at least would, as a user at my office site.

Of course the systems still go down, both for scheduled work and for crashes, and it is annoying to have to go back to paper for an hour or so. Implementing a local backup, with at least the problem lists and last visit of each patient stored on site, has not yet happened, although it has been discussed.

As far as bandwidth demands - I would think that the home videos of cats playing the piano use a lot more. My impression is that most non-video data doesn’t require all that much.

In theory those would be ok if the app is running remotely and just sending screen images/update instructions to the client.

It depends what kind of company it is. I work for a company that deals with digital map data. One of our very common tasks is to convert the data from one format into another. This is a very CPU-intensive process, and we have to do it for an enormous amount of data. Often for entire continents worth of road map data, sometimes for the whole world. That’s millions of roads. We are always trying to find faster ways of doing this – more efficient algorithms as well as faster computers.
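The "convert an enormous amount of data in parallel" job the poster describes can be sketched generically. The conversion function below is a trivial stand-in (km-to-miles on toy records), not real map-format code; the pattern is just fanning a CPU-heavy per-record transform out across cores:

```python
# Generic sketch of fanning a CPU-heavy per-record conversion out across
# cores. The "conversion" here is a trivial stand-in for real format work.
from multiprocessing import Pool

def convert(record):
    # Placeholder for an expensive format conversion of one road record.
    road_id, length_km = record
    return (road_id, round(length_km * 0.621371, 3))  # km -> miles

if __name__ == "__main__":
    roads = [(i, float(i)) for i in range(1, 1001)]   # toy "continent"
    with Pool() as pool:                              # one worker per core
        converted = pool.map(convert, roads, chunksize=100)
    print(converted[:2])  # -> [(1, 0.621), (2, 1.243)]
```

For genuinely continent-sized datasets this is exactly the workload that cloud batch services target: the per-record work is independent, so it scales almost linearly with however many machines you rent.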

Cloud Computing is actually sort of taking off, but only in a realm where it’s somewhat useful.

For Facebook, MySpace, Google, etc. you can build a little web application that people can use. That application (be it a game, calculator, calendar, or whatever) has to be run by a server somewhere, but if you want to set one up for yourself, you have to pay for a full server, someone to keep it running, etc. Instead, you can host the source code with Amazon or another cloud computing service, and they charge you by CPU usage rather than by machine or anything. If your app doesn’t get a lot of use, you pay nothing, and if it gets tons, then you have to pay rent.
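The "pay nothing when idle, pay rent when popular" billing model can be sketched in a few lines. The free-tier size and hourly rate here are invented for illustration, not any provider's actual pricing:

```python
# Toy metered-billing model: pay per unit of CPU actually consumed,
# with a free tier, instead of renting a whole server.
# Rates and tier size are made up for illustration.

def monthly_bill(cpu_hours_used, free_tier_hours=10, rate_per_hour=0.05):
    billable = max(0, cpu_hours_used - free_tier_hours)
    return billable * rate_per_hour

print(monthly_bill(0))      # idle app: 0.0
print(monthly_bill(5))      # inside the free tier: 0.0
print(monthly_bill(1000))   # popular app: 990 hours billed
```

The contrast with the old model is that a dedicated server costs the same whether the app gets zero users or a million; metered billing moves that risk onto the provider.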