Is "cloud computing" just a buzzword?

I’m not a techie at all, but going by my peripheral awareness of the thing, it sounds to me like “cloud computing” is just another name for “software as a service” or plain old using a network. Is cloud computing just a new buzzword for old ideas? Why do people seem to be getting so excited about it?

(Perhaps this is better suited for IMHO, sorry if it’s in the wrong place)

It’s also the idea of a ubiquitous, single computing experience from all of your devices, maybe even all devices in general.

It is kind of a mix of a number of different concepts right now. What Google means by cloud is not the same as what a lot of IT departments mean by cloud.

For a company like Google, it is a service-based concept: you and/or your company own few to no servers, all apps and documents are stored on the Google cloud, and you pay for what you use, when you use it. Basically a PC is just the client you use to connect to your apps. In theory, then, the PC is replaceable by a phone, or a tablet, or whatever, and you aren’t wedded to Windows; you can use any OS. You still need an application to connect to the cloud apps, and right now that means a web browser. But Google would like to see that replaced by something much more lightweight. Their ideal replacement would be an optimized tool… and their browser, Chrome, is what they hope to turn into that tool.

For a lot of companies it is more of a concept of server farming: you have a whole bunch of servers (usually virtualized), and when you need to connect to e-mail (for example), a form of load balancing takes place and you are shunted to whatever node has the most free capacity. Should that node become “full,” you may be shunted off to another without ever knowing it. It also offers advantages in up-time: if you have 15 e-mail servers, you can lose any of them without a disruption in service. No more midnight service calls because you need to update an e-mail server, because now the e-mail server is every server… and no server in particular. Thus a Client <—> Cloud model rather than the traditional Client <—> Server model of computing.
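To make the “shunt you to whichever node has the most free capacity” idea concrete, here is a minimal Python sketch. The node names and numbers are invented purely for illustration; a real load balancer tracks far more than one metric.

```python
# Toy load balancer for a pool of (virtualized) e-mail servers.
# All names and figures below are made up for the example.
nodes = {
    "mail-01": {"capacity": 100, "load": 72},
    "mail-02": {"capacity": 100, "load": 38},
    "mail-03": {"capacity": 100, "load": 95},
}

def pick_node(pool):
    """Return the node with the most free capacity."""
    return max(pool, key=lambda name: pool[name]["capacity"] - pool[name]["load"])

print(pick_node(nodes))  # -> 'mail-02'; if that node fills up, the next connection is shunted elsewhere
```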

Both concepts are similar, with the exception of who owns the “cloud.” And the corporate IT concept doesn’t embrace pulling Word and PowerPoint off of all the PCs to put them on the cloud quite yet.

As you grokked, neither the Google concept nor the corporate IT concept is precisely new, but a lot of newer technologies are maturing that make them more practical to implement. You can’t sell “utility-based computing,” though; that is so 1970s, or even 1960s. And “software as a service” is very 1990s. It would be appropriate to consider it an evolution of these concepts rather than something entirely new.

The cynical view of an old programmer is that “cloud computing” is a new name for doing it just like the old mainframe systems, except that the cheap terminals are replaced by more expensive PCs, and the mainframe CPU is replaced by servers on the network (some of which may actually BE mainframes!).

I am an IT professional, and this stuff comes and goes over time, but I think it will stick this time. It already has started to in a big way. “Cloud computing” is just a buzzword that refers to a general idea (or several competing ideas) rather than a single technology, but it is real. Most people already use some version of it: Internet-based e-mail like Gmail is cloud computing, as are photo albums stored on social networks.

Cloud computing is just the ability to access your data from independent hardware of your choice, from almost anywhere. Personalized media storage isn’t a big technological leap and has already happened, but it is getting better every day. I got an Amazon Kindle e-book reader as a work gift today. It is pretty cool, and while I don’t really have that much use for it, the idea is good. Instead of buying paper books, you buy them virtually and can then read them any time in the future, on a single device or multiple devices that look similar to a book, instead of keeping a bookshelf.

That doesn’t mean that true PCs and local servers are going away anytime in the near future, however. The specialized needs of anyone above the most basic user are simply too much for the concept to handle right now, and probably always will be. That is why most people use Microsoft Office rather than something like Google Docs if they can afford it, and why all companies have specialized needs that require their own computers. You will see more of a mix: shared data moving into a cloud to be accessed from anywhere, but much the same as now for any system that needs much customization.

In short, yes.

Cloud computing is more or less “software as a service”. You log on to your PC and you access an application. But instead of that application needing to be installed locally, it’s “in the cloud”. You don’t know where it’s located, and you don’t care. It’s just there when you need it.

Sort of like when you flip the switch, and your light turns on. You don’t know where the electricity you are using comes from; whether it’s from the hydroelectric dam upriver, or from some coal plant 3 states away. The power is “in the cloud”.

“Cloud computing” sounds nicer than “Assimilation into the collective.”

And yes, resistance is futile.

That’s only one specific type of cloud computing. The term in fact covers a broader range of concepts. For example, Amazon’s EC2 (Elastic Compute Cloud) service provides on-demand Linux VMs upon which you can run whatever software you like (commercial, free, or your own stuff). Many people use this service to host web applications (some public, some not). Others use it as a distributed computing platform for parallel processing of large data sets. Some enterprise customers use Amazon’s VPN service to replace all or part of their internal hardware collection with EC2 VMs.
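For anyone curious what “on-demand Linux VMs” looks like in code, here is a hedged sketch using the boto3 Python library (an assumption on my part for illustration, not necessarily what anyone in this thread uses); the AMI ID and instance type are placeholders, and you would need AWS credentials configured.

```python
# Sketch only: launch a single on-demand VM in EC2 via boto3.
import boto3

ec2 = boto3.resource("ec2")
instances = ec2.create_instances(
    ImageId="ami-xxxxxxxx",   # placeholder: whatever Linux image you want to boot
    InstanceType="t2.micro",  # placeholder size
    MinCount=1,
    MaxCount=1,
)
print(instances[0].id)  # the VM now exists "somewhere in the cloud"; you never touch the hardware
```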

Amazon’s platform also offers a lot of software-as-a-service type things, such as their S3 distributed storage service and their SQS messaging system.
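In the same spirit, a minimal sketch of S3 as storage-as-a-service (again via boto3; the bucket and key names are invented, and the bucket would have to exist already):

```python
# Sketch only: write and read an object in S3.
import boto3

s3 = boto3.client("s3")
s3.put_object(Bucket="my-example-bucket", Key="hello.txt", Body=b"stored in the cloud")
obj = s3.get_object(Bucket="my-example-bucket", Key="hello.txt")
print(obj["Body"].read())  # b'stored in the cloud'; you never know or care which disk it landed on
```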

I bring all this up because over the past couple years I’ve been heavily invested in Amazon’s platform, developing some really fun (if you’re a gigantic nerd) stuff.

Bingo. Timesharing with a much more complex environment.

Although not necessary for cloud computing, an important component of efficient cloud computing is a virtualized environment. Software like VMware is used to create virtual machines, which are really just software packages; several virtual machines can run on a single physical server. The advantage of virtualization is that you can use software to dynamically reconfigure how your hardware is being used in response to changing loads on your system, greatly increasing hardware utilization. Without virtualization, you would have to either rebuild servers to meet needs, or add servers, or have special-purpose servers sitting around idle during off-peak periods.
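As a rough illustration of the “dynamically reconfigure how your hardware is being used” point, here is a toy Python sketch that packs VMs onto physical hosts and can simply be rerun when loads change. The host names and memory figures are invented; real tools (VMware’s DRS and the like) are far more sophisticated.

```python
# Toy VM placement: put each VM on whichever physical host has the most free RAM.
# All numbers are made up for the example.
hosts = {"esx-01": 32, "esx-02": 32}                  # free GB of RAM per physical host
vms = {"mail": 12, "web": 8, "db": 16, "build": 10}   # GB each VM needs right now

def place(vms, hosts):
    free = dict(hosts)
    placement = {}
    for vm, need in sorted(vms.items(), key=lambda kv: -kv[1]):  # biggest VMs first
        host = max(free, key=free.get)                # host with the most free RAM
        if free[host] < need:
            raise RuntimeError("time to buy another physical server")
        placement[vm] = host
        free[host] -= need
    return placement

print(place(vms, hosts))  # e.g. {'db': 'esx-01', 'mail': 'esx-02', 'build': 'esx-02', 'web': 'esx-01'}
```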

Here is a white paper written by NIST (small Word file, 39K) on the topic. At this point there is no one universal description of cloud computing that everybody agrees on, but this paper is a very good primer for someone just starting to look at the topic. However, it is a year old, so already obsolete. :slight_smile:

Great info, thanks!

That is part of it too. Citrix makes apps that do the same thing. I should take back part of what I said about cloud computing being anti-customization; it usually is, but it doesn’t have to be. I can sit here at my home computer, log in, and turn my desktop into the controller for servers around the world, from India to Australia. It is almost like remoting into another PC, although some of it is so well done that my monitor just morphs into another computer. I don’t even need to know where it is located, or who manages it and backs it up, yet it is completely customizable and I can install anything I want on it or store anything on it.

The key about technology of this sort isn’t the raw ideas; cloud computing has been around, in some form, for as long as computers have. It is about all the little details coming together to make it work for lots of different purposes and to make data accessible outside of specific hardware or locations. When you have smartphones that can access your e-mail from anywhere, load your personal files from anywhere, or control a mainframe in Taiwan, that is when you know it is starting to come together. That is happening in a big way now.

Yes. The main difference is that we now have a network infrastructure that allows just about anybody to access those servers. So it’s now practical for home users to use a network server for word processing, for example.

The previous comments are pretty spot-on (at least as much as is possible with the nebulous “cloud”). I’d just add that, from a business perspective, there are a whole lot of laws that make “cloud computing” a liability (geo-location of data, for instance, is a biggie in the EU). Having had the misfortune of wrangling this horse in a work environment, I can only say, “ugh - do you want to save money or be compliant? Your call, boss-man.”

The explanations here so far are superb. At my ISV we are poking into MSFT’s cloud offerings now and should have releasable product in a few months.

This recent thread, “I'm not quite understanding the practical value of ‘cloud’ (ie internet) based computing” (in Factual Questions), is on topic and has a link to an earlier thread as well. This thread so far is more about CC from the tech & corporate end; that thread approached it more from the consumer benefit end.

How is this different than the VAX/VMS that I used 30 years ago? Seriously, I mean–isn’t this capability pretty common?

I definitely hear this buzzword thrown around a lot, and everyone is sure it’s here to stay this time, which surprises me. While I agree that the idea can definitely work in theory given the ubiquity of Internet connectivity, the human factor is still an issue few people seem to address this time around. Back in the client-server days of the early 1990s, I seem to recall the same great logic of nothing being stored on the local machine, which was great until the server went down at an unexpected time, affecting large groups in the company.

If I have a customer service problem now, I am usually routed to a guy in India who may not be able to help me solve it, and that is usually just for the use of one program. Imagine if ALL my data were now inaccessible and the same guy in India wasn’t helping. Business would come to a standstill. Without knowing where my data is stored, I am also at the whim of whoever is storing it, hoping and praying they do proper backups, have proper security (both internal and external) so others aren’t viewing my data, etc. That’s a lot of risk.

I’ve also not heard the pricing model discussed in much detail. So I get charged for how much I use various programs? How is use defined? Right now, I have the luxury of opening a document in MS-Word, leaving for half a day of meetings, and coming back to it right where I left off. Sometimes I may have a document open the whole day but never actually type a word into it because of other work distractions. Is some guy now going to charge me for a day’s worth of MS-Word use when I didn’t really use it? If so, my lazy computer habits are going to cost my company a fortune and we will have to re-train ourselves to no longer multi-task, because the company is losing money every time we have a program open we aren’t using. I can only imagine what a billing nightmare this will become, and once again my good buddy in India will be on the other end of the line trying to discombobulate the bill among the hundreds of users who feel they were wildly overcharged.
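Just to put made-up numbers on that worry, here is the gap between billing on hours-the-app-was-open and billing on hours actually worked; the rate is entirely hypothetical.

```python
# Hypothetical numbers only: how "use" is metered matters a lot to the bill.
hours_open = 9        # document sat open all day
hours_active = 1.5    # time actually spent typing
rate_per_hour = 0.10  # made-up per-hour charge for a hosted word processor

print(f"billed on wall-clock time: ${hours_open * rate_per_hour:.2f}")   # $0.90
print(f"billed on active use:      ${hours_active * rate_per_hour:.2f}") # $0.15
```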

I see that you agree with the point that t-bonham@scc.net and I are making.

I was just explaining the current state of the art, but there is nothing new under the sun.

The differences are:

  1. A distributed system across many small machines instead of one giant mainframe, which allows you to dynamically balance the load of on-demand VMs across the whole system,

  2. The interwebs means anybody can become a customer and use the resources, instead of just those with direct access to the mainframe, and

  3. VMS blows goats.

Hard drives are cheap. I have more than enough storage for all of my programs and documents.

Would I want a low-power cheap terminal that stores everything but its own firmware somewhere on the network? No. I want control of my own data and apps. But I’m something of a geek and would know the difference.

The “it just works” crowd will be the ones who drive this development. Most people won’t understand the difference between a local operating system and hard drive on one hand, and a browser, server-based applications, and remotely stored documents on the other. I’ll be nerdily sneering at them. :slight_smile:

My company uses a thin client model, and we have more than our share of geeks, actually making computers. It is a great thing. Here are the advantages:

You can get to your files and session anywhere in the company. When I give a talk, I set up the presentation on my thin client, go to the meeting room, stick in my smart card, and there it is. You could go to Japan, stick in your smart card, and resume working as if you never left.

You might do backups, but lots of people don’t. If my work computer blows up, I grab another one and keep going with less disruption than if my cell died.

You can adjust your computing environment according to your needs. I have a window open for the regular server, but also several on a compute server where I can run compute-intensive jobs. For biggies we also have a compute server ranch, with 10K CPUs.

With the size of the network you have pretty much all the apps you’ll ever need, and can get more. If you read Computerworld, you know that plenty of people add dangerous apps to more or less private machines hooked up to the network.

I used to have my own workstation, and it was cool to have root, but this is actually a lot easier in terms of getting real work done, without having to play junior deputy sysadmin.