Cloud Computing Misery

Cloud computing refers to accessing non-local IT resources (see the Wikipedia article).

Typically these resources are hosted by another company, possibly in far distant lands, and accessed via the internet. The obvious benefits are reduced cost due to customer pooling and no need to keep a staff of engineers on site to support your own infrastructure.

But the big disadvantage I have seen: it's very s l o w. We're experimenting with Google Apps (replacing Word, Excel, and possibly others), corporate Gmail accounts instead of maintaining our own Exchange servers, and PlanView to replace MS Project. The only thing I can imagine that would make MS Project worse is making a web-based version of it.

I just love sitting there watching that little hourglass spin around after I enter a simple calculation into a Google spreadsheet and hit save. And people already complain enough when they see that little "please wait" bubble pop up after re-sorting the columns in their Outlook inbox. Just wait until hundreds of people start placing "network latency" calls to the helpdesk because PlanView "seems to be a bit slow today".

How much does your place of business rely on cloud computing?

Not much at all, but I do work at the nexus of distributed computing and AI, so it's an interest of mine. Part of that work is making "my" infrastructure more "intelligent": making decisions on the fly about available computing speed, power draw, memory, etc., in order to relocate software components and improve performance. In fact, even the notion of "performance" is subject to on-the-fly decision making. :slight_smile:
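
To make that concrete, here's a minimal sketch of the kind of placement decision I mean. Everything in it (the Node record, the metrics, the scoring functions) is hypothetical, not our actual system; the point is just that the objective itself is a pluggable parameter that can be swapped at runtime.

```java
import java.util.Comparator;
import java.util.List;
import java.util.function.ToDoubleFunction;

// Hypothetical per-node metrics, sampled at runtime (Java 16+ for records).
record Node(String id, double cpuFreeGhz, double powerDrawWatts, double memFreeGb) {}

public class Placer {

    // Pick where to relocate a component. The scoring function IS the
    // definition of "performance", and it can be swapped on the fly.
    static Node bestNode(List<Node> nodes, ToDoubleFunction<Node> score) {
        return nodes.stream()
                .max(Comparator.comparingDouble(score))
                .orElseThrow();
    }

    public static void main(String[] args) {
        List<Node> cluster = List.of(
                new Node("a", 2.4, 95.0, 8.0),
                new Node("b", 1.1, 40.0, 16.0));

        // Same cluster, two different notions of "best".
        Node fastest  = bestNode(cluster, n -> n.cpuFreeGhz());
        Node greenest = bestNode(cluster, n -> n.cpuFreeGhz() / n.powerDrawWatts());

        System.out.println("fastest=" + fastest.id() + ", greenest=" + greenest.id());
    }
}
```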

Personally, while admitting a decent degree of ignorance about what's commonly available, my impression is that much cloud computing software overlooks a glaring deficiency: one that is easy to see in the abstract but incredibly difficult to resolve in practice. Namely, creating robust and (fairly) substantial clients. I mean, ferfuckssake, it's not like the world is moving to dumb terminals (however much software-as-a-service vendors might like it to); at this point, almost any device at the user endpoint still has a fair amount of processing power. What's needed is sort of a "chubby" (as opposed to "thin" or "fat") client/server architecture, like email, for instance. Something for which Java seems pretty well-suited.
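
By "chubby" I mean something like the following toy sketch (all names and structure are mine, purely illustrative): the client owns a local copy of the data, every user action completes against local state instantly, and a background loop reconciles with the server, much the way a decent mail client handles its offline cache. The network only ever shows up as eventual consistency, never as an hourglass.

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// Hypothetical "chubby" client: all reads and writes hit local state
// immediately; a background queue reconciles with the remote service.
public class ChubbyClient {
    private final Map<String, String> localDocs = new HashMap<>(); // local cache, instant access
    private final Queue<String> pendingSync = new ArrayDeque<>();  // keys awaiting upload

    public void edit(String docId, String contents) {
        localDocs.put(docId, contents); // user sees the change with zero network latency
        pendingSync.add(docId);         // the server catches up later
    }

    public String read(String docId) {
        return localDocs.get(docId);    // never blocks on the network
    }

    // Called periodically by a background thread; a failed upload just retries later.
    public void syncOnce(RemoteStore server) {
        String docId = pendingSync.poll();
        if (docId != null) {
            server.upload(docId, localDocs.get(docId));
        }
    }

    // Stand-in for the real service endpoint.
    interface RemoteStore {
        void upload(String docId, String contents);
    }
}
```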

But much more work needs to go into the foundational design of these apps. It's not clear (to me, anyway) that there's an abstraction (or even a set of abstractions) that can be applied in a general way, and per-application design work is time-consuming, expensive, and error-prone. In addition, there needs to be a much greater focus on secure computing. After all, in minute-to-minute (or second-to-second) usage it's the delay that's unbearable, but as a long-term consideration I would never trust another party with exclusive control of my data (through its accessibility). I don't even feel comfortable sharing (through duplicate storage) much of my data, much less have confidence that it's secure from third parties.
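
On the trust point, one partial remedy (my suggestion, not something today's services offer by default) is client-side encryption: the key never leaves your machine, and the provider only ever stores ciphertext. A minimal Java sketch, assuming AES-GCM and an imaginary upload step:

```java
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class ClientSideCrypto {
    public static void main(String[] args) throws Exception {
        // The key is generated and kept on the client only.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        byte[] iv = new byte[12];               // fresh nonce per encryption
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal("my private spreadsheet".getBytes());

        // upload(iv, ciphertext) -- hypothetical: the host can store and
        // serve the data, but never read it.

        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        System.out.println(new String(cipher.doFinal(ciphertext)));
    }
}
```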

IMHO, pervasive cloud computing will happen eventually. Doesn’t make the transition any less painful, though.