App virtualization… is this as cool as it looks?

I just downloaded a demo from www.xenocode.com

From the looks of it, it converts a Windows app into a single standalone executable that can be moved around, backed up, copied off, and restored to a new Windows install with ease, without all of the BS reinstallation.

Is this the case or am I missing the point?

Guess I could wait and see…the client is taking forever to index my machine so I can play with it…

We use it to test software at my job, but it's not like a second PC. It's REAL slow, almost to the point that I keep a second PC just for testing purposes.

Are you sure you're not talking about virtual machines?

How come when I try to download the virtual Firefox, I just get a regular Firefox executable?

True… I was… hmm, probably should have read the post first.

On that note, I have used a few portable apps, mainly OpenOffice and Firefox. Both work great, and I don't notice any slowness. I used the USB versions of those apps.

That's just it: you get a single executable file, no installation needed.

With freeware apps like OpenOffice, AVG Free, Spybot, etc., this would seem to be the dream of many IT people. Why wouldn't Spybot et al. just come as a virtualized file rather than an installer package? Is it unique to each machine or something? That's part of what I'm trying to understand.

This is similar to SoftGrid, an MS technology that delivers virtualised applications in a supposedly efficient manner.

Basically, when you launch the virtualised application, it creates an isolated virtual machine running a clean copy of Windows, then launches the app within that virtual machine. The app is sandboxed inside that virtual machine. The distributed image contains all the application files in a virtual file system, so all of the application's linked DLLs are wrapped inside the virtual application. They use file system joining to get at the base Windows files from outside the VM, and probably a snapshot of the execution environment for rapid startup. There must also be some physical-to-VM communication channels to allow a higher level of integration.
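
As a rough sketch of how I imagine that joining works (a guess, with made-up paths - the real products hook the Win32 file APIs, they don't do this in Python):

```python
from pathlib import Path

VIRTUAL_ROOT = Path(r"C:\VirtualApps\MyApp\filesystem")  # files shipped in the package
HOST_ROOT = Path("C:\\")                                 # the real Windows install

def resolve(requested: str) -> Path:
    """Look in the package's virtual file system first, then fall
    through ("join") to the host's files."""
    relative = Path(requested).relative_to(HOST_ROOT)
    virtual = VIRTUAL_ROOT / relative
    return virtual if virtual.exists() else HOST_ROOT / relative

# A wrapped DLL resolves to the package's copy; a stock Windows file
# falls through to the real system32.
print(resolve(r"C:\Windows\system32\kernel32.dll"))
```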

The benefits generally touted include application isolation. This means that there are no DLL conflicts, the application does not require registered components, and users need no additional rights to execute it. Runtimes, such as Java or .NET, can be included in the virtual application. This really does help when dealing with multiple apps with mutually exclusive Java runtime requirements. It's also pretty easy to package this sort of stuff, but profile and policy stuff must be a bit tricky.

But the app runs in a sandbox, so it may have trouble communicating with related applications (cut and paste may work, but OLE embedding probably will not).
Application virtualisation will also eat up disk space. A virtualised app will contain all the libraries and runtimes it requires, and every virtualised app could contain those same libraries. I am guessing that each app is built against a standard base XP install - so there may be no saving even if your base OS image already includes those libraries.

It’s all very cute, and I can see real applications for this on the server side, but I think it is all a bit much for desktop apps. But then again, I have spent months of my life trying to integrate apps with multiple Java runtimes into packaged environments - if it solves that problem, it may just get my vote.

Si

I don’t want to start a platform war, but isn’t this how applications work on Apple computers? I’ve only recently acquired a Mac (and it’s an old one), but the applications I’ve installed on it consist of a single package that can be dragged and dropped to cleanly and completely install or uninstall.

Macs have better platform control :wink:

Mac apps used to be statically linked, I think. I am not sure whether they still are. Macs also don’t have a registry in the way Windows does, and the OS-supplied features are more fixed and not as extensible as Windows’.
But you are not getting the total virtualisation offered by Xenocode.

Shared libraries are great, but you need to be able to manage versioning issues. Microsoft did not manage this particularly well to start with, and we are now in a real mess. Linux handles it better, but it is still not perfect. Some people (in this period of cheap storage) advocate a return to static linking.
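
To make the versioning problem concrete, here is a minimal sketch (the naming scheme is invented, not any real loader's): two incompatible generations of a library can coexist if the loader only ever trusts a matching major version.

```python
from pathlib import Path

def resolve_library(lib_dir: Path, name: str, required_major: int) -> Path:
    """Pick the newest foo-<major>.<minor>.dll whose major version
    matches - so foo-1.3 can replace foo-1.2, but foo-2.0 never will."""
    candidates = []
    for f in lib_dir.glob(f"{name}-*.dll"):
        major, minor = (int(x) for x in f.stem.split("-")[-1].split("."))
        if major == required_major:       # same major = assumed compatible
            candidates.append(((major, minor), f))
    if not candidates:
        raise FileNotFoundError(f"no compatible {name} v{required_major}.x")
    return max(candidates, key=lambda c: c[0])[1]
```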

Si

Virtualization is going to be big in the next few years, at least in the corporate world. In ten years, most servers will be virtualized (the cost and usage efficiencies will make it very desirable), as will many desktops.

I just returned from the Citrix iForum in Las Vegas and saw things like laptop computers that are completely virtual – you connect to the main server via a fast network card and run everything on a virtual machine, saving things on the virtual server. I saw people switching applications from one server to another without any of the users noticing a difference (thus, if you need to shut down a server to reboot, you can move the virtual servers away, reboot, and move them back, and no one is inconvenienced).

We already have Citrix at work, which, among many other things, allows Mac computers to run Windows programs. Virtualized applications show up on my start menu and run pretty much as though they were installed on my computer. And I use a virtual machine to test software – install it, play with it, then delete all settings.

In ten years, at least some of your computing will be done virtually.

I agree with this - but I suspect it is virtualisation along the lines of this application virtualisation that will really take off - rather than virtualising entire systems, specific applications will be virtualised, isolated, and shifted around as needed, while the kernel/OS will not be virtualised. And full desktop virtualisation will have fewer practical applications from the normal user's perspective.

This is where virtual machines are really useful, but having spent the last 3 months working on a large multi-server, multi-tier testing environment that is entirely virtualized, I can say it does have issues…

Hmm, colour me skeptical, but I have been hearing the same promises about Citrix/Terminal Services for over … I don’t know, maybe … 10 years. I’ve implemented a thin-client-only solution. I’ve used Citrix many times. It’s been a part of many projects I have worked on.
But it never really catches on. It always seems to end up as a niche solution for a few users or some specialised apps. Citrix+VMware does have interesting possibilities, though.

Si

Like si_blakel says, many more Mac apps are statically linked, but it really goes a little deeper than that. Both Windows and Macs have very robust libraries for providing application functionality. At a very high level, this is .NET and/or the Win32 API for Windows, and Cocoa and Carbon for the Mac. On both platforms you can write a single executable that calls these libraries without needing an installer to do special things for you.

Windows historically, though, has this interesting concept where every time you add a program, you add to the operating system. Install Internet Explorer, and you get all of these new system-level DLLs. Add Windows Media Player, get new system-level DLLs. Add DirectX or MS Office, and, well, you know. Third-party developers fall into this, too: rather than build a monolithic executable, they break it all into DLLs. Now you have Windows DLL hell (although since XP, it’s not been as hellish). What’s worse, now that the IE, DirectX, Office, etc. APIs are part of the operating system, third-party applications that come to depend on them also distribute these DLLs as part of their installers, just in case you’ve not already installed IE, DirectX, etc. Keep in mind that since XP, many of these libraries are included in the base OS.

It is mostly because of DLLs that Windows installer packages have been necessary. Of course, now that you have an installer package, you can do things like populate the registry during installation, which has some benefits. For example, you can identify .doc files as Word documents at installation, rather than waiting for the program to make that identification at first run. As a consequence of so many installers on Windows, you end up in a self-fulfilling mode, too – because an installer is expected, an installer is supplied, even if it’s not needed.
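
That install-time registry population boils down to a couple of writes like these (a hedged sketch - the ProgID and paths are invented, and a real installer does this natively, not via Python):

```python
import winreg  # Windows-only standard library module

def register_doc_handler(exe_path: str) -> None:
    # Map the extension to a ProgID...
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                          r"Software\Classes\.doc") as key:
        winreg.SetValue(key, "", winreg.REG_SZ, "ExampleWord.Document")
    # ...then tell the shell how to open files of that ProgID.
    cmd_key = r"Software\Classes\ExampleWord.Document\shell\open\command"
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, cmd_key) as key:
        winreg.SetValue(key, "", winreg.REG_SZ, f'"{exe_path}" "%1"')

register_doc_handler(r"C:\Program Files\ExampleWord\word.exe")
```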

Macs traditionally have been monolithic executables. Kind of. Pre-PPC, all code was limited to 32KB CODE resources, so it was swapped in and out all the time dynamically. If you wanted a library, you normally embedded it in the code. If you needed something more at the OS level, you wrote an extension that patched the OS to give you functionality (processor traps). Such programs were generally the only programs that needed installers on the Mac. The normal expectation was a drag-and-drop install.

Not until the arrival of the PPC Macs and System 7.5.3 (or was it 8.0+?) was the concept of DLLs added to the Mac OS. These were all kept in the (IIRC) Libraries folder of your System Folder. Such programs also normally needed installers, but I seem to remember that the last Classic version of Office was drag-and-drop, and its first-run procedure would install these for you.

Developers on the modern Mac still try to keep the tradition of the drag-and-drop install. Office still does this. Most “smaller” apps still do this. But many other complex apps still need to install libraries, kernel extensions, control panels, or media in your Users folders, and so will use an installer.

The Mac thing:

Macs will (and have for years, dating back to pre-OSX days) let you share the applications installed on your machine, so that another Mac user can run your software from their machine simply by mounting your drive on their Desktop and double-clicking the app. Similarly, you usually don’t have to “install” (or “uninstall”, for that matter) your apps; just drag them from any volume that has them to your own hard disk and you’re good to go. This makes upgrading to a new computer very easy: just copy all your old apps from your old computer’s HD. There are exceptions, which I’ll address at the end.

It works because:

a) No registry, as mentioned before;

b) Mac apps hardly ever use full-blown file paths to denote where to find other components. None of this C:\Documents and Settings\Users\AHunter3\ApplicationX\library.dll stuff. The old-world-style apps (which include all pre-OSX apps, but this still applies to some modern OS X apps) may need to reference files outside the application file itself, but they either do so by conjuring them up by file type code and file creator code (in which case it doesn’t matter where the file is, as long as it’s somewhere on the computer) or by relative path (the startup disk’s System Folder’s Preferences folder, wherever that may happen to be; or the MyProgram Extensions folder, which should be in the same folder as the application file itself, wherever it may be) - see the sketch after this list. The more modern OS X “.app” package, on the other hand, is a folder masquerading as a file; to the Finder (and the end user) the executable application and all of its support files appear as a unitary object, and wherever it goes, all its luggage goes with it.

c) As you can surmise from these descriptions, Mac applications have tended not to rely on batches of library files whose use other programs also share. Tended, I say, rather than “do not” (see exceptions below). This means our apps may take up more room, and routines may be installed in multiple places to accommodate multiple apps that use them.

d) Exceptions: The old OS 9 (8, 7) world had folders of files that contained “INIT” or other library code in them: the Extensions folder and the Control Panels folder of the System Folder. These were closely comparable to Windows .dll files. A great many such files were Apple-installed (many apps would share their use, but the apps’ installers didn’t have to put them there; they’d already be there). But many programs would in fact install their own battery of such files, and often the apps would not run without them. You’d need to copy these as well as the application file or folder. They usually had long, decently self-explanatory names (“Norton Shared Lib” rather than FNUNSHL.dll) and more often than not got dumped into a subfolder clearly named for the app that had dumped them there, but any way you cut it, you needed 'em. As for OS X, such things are less common, but some programs need to install low-level kernel extensions or other code that runs in admin space rather than user space. For those we do have installers (admin password required to install), and such apps may not work if you simply copy the app itself to another computer. Finally, a few tightwad software companies with intrusive attitudes have deliberately done things over the years to make it less possible to simply copy an app to a different computer and have it function there; their motivation is strictly piracy reduction, user convenience be damned. <cough>Quark<cough>.
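
Here’s the sketch promised under (b), with Python standing in for illustration and made-up file names: everything is found relative to wherever the bundle sits right now, so dragging it somewhere else breaks nothing.

```python
import sys
from pathlib import Path

# In an OS X bundle the binary lives at MyApp.app/Contents/MacOS/MyApp,
# so Contents/ is one directory up from the executable itself.
CONTENTS = Path(sys.argv[0]).resolve().parent.parent

def resource(name: str) -> Path:
    """Locate a support file by relative path, never an absolute one."""
    return CONTENTS / "Resources" / name

settings = resource("settings.plist")  # still found after any drag-and-drop
```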

In short, yeah, we’ve always had something pretty close to app virtualization.

Not wanting to pick nits, but it isn’t application virtualisation. It is a lot of really good things (easy install, easy uninstall, etc).

Application virtualisation includes sandboxing the app from the host system, giving the app rights to a virtual system registry without compromising the host system, standardised configuration, and controlled persistence of specific settings.

As well as single file execution :wink:
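
The virtual registry piece works roughly like this copy-on-write sketch (plain dicts standing in for the real registry - the concept, not any product’s implementation):

```python
class VirtualRegistry:
    """Reads fall through to the host's values; writes stay inside
    the sandbox, so the host registry is never modified."""

    def __init__(self, host: dict):
        self._host = host      # the real registry (read-only from here)
        self._overlay = {}     # the app's private copy-on-write layer

    def read(self, key):
        if key in self._overlay:       # the app's own value wins
            return self._overlay[key]
        return self._host[key]         # otherwise use the host's value

    def write(self, key, value):
        self._overlay[key] = value     # never touches the host

host = {r"HKLM\Software\Vendor\Setting": "host-value"}
reg = VirtualRegistry(host)
reg.write(r"HKLM\Software\Vendor\Setting", "sandboxed-value")
print(reg.read(r"HKLM\Software\Vendor\Setting"))  # -> sandboxed-value
print(host[r"HKLM\Software\Vendor\Setting"])      # -> still host-value
```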

Si

Accurate correction. I should have written “we’ve always had something pretty close to the OP’s description of what app virtualization provides to Windows users”.

I assume app virtualization doesn’t sandbox the app from the entire host OS, though? If it could do that, you could pop a copy of Firefox.exe on your elderly Amiga and use it to come here and post! (Give or take hardware insufficiency). I assume you meant it insulates it from the need to reference shared libraries, registry entries, and the like, yes?

The current goal is to make desktop virtual applications indistinguishable from running the app on your computer. And it’s getting closer. Citrix still has issues, but in five years most will be dealt with.

Currently, yes. But they are being worked on and eliminated, since virtualization has such potential.

I was referring primarily to corporate servers, which will probably be virtual in the next ten years. Applications will be slower, but people will be using them.

In my business I am looking for ways to restore apps quickly to business customers' machines. Since I see them regularly anyway, if it's just a matter of showing up with a laptop carrying a clean XP load and running through all of their apps with a virtualization tool, I can easily see this being helpful on desktops.

For example, one of the services I offer businesses on service plans is loaner machines. If this works the way I foresee, I could have a custom set of apps for each customer on a DVD or two and just drop them onto the desktop of a loaner or a new machine, and they'd have no downtime or additional charges while I sit and reload software.

Pretty much. I am guessing, because I haven’t had a chance to play with SoftGrid or Xenocode, that the packaging environment wraps the app into a vanilla XP install but strips out the standard XP files and registry (otherwise every app would be about a gig: the app plus XP). The VM’s file system will use links out of the VM to all the files that have been stripped out, and similarly for much of the registry. Plus, the VM is not a full boot-system VM - it will be a partial, lightweight isolated environment. So no running Office 2007 on Linux with this.
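
Continuing the guesswork, the stripping step could be as simple as diffing the captured install against a manifest of the vanilla baseline (the manifest format here is made up):

```python
import hashlib
from pathlib import Path

def file_hash(p: Path) -> str:
    return hashlib.sha256(p.read_bytes()).hexdigest()

def files_to_package(baseline: dict, capture_root: Path) -> list:
    """`baseline` maps relative paths to hashes from a vanilla XP
    install. Keep only files the app added or changed; everything
    else is stripped and linked out to the host at run time."""
    keep = []
    for f in capture_root.rglob("*"):
        if not f.is_file():
            continue
        rel = str(f.relative_to(capture_root))
        if baseline.get(rel) != file_hash(f):  # new or modified file
            keep.append(f)
    return keep
```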

Nice idea. The catch is that building your packages is the real problem - either custom MSIs or the VM-style apps. I get paid good money to do that :wink:

Your packaging tools are pricey - £5000 per seat for Wise Packaging Enterprise for MSI. There are some cheaper options, but they require more hard work at the packaging end. Testing is really important. If every client has (say) a different Zip tool, or the same tool but a different licence, you have more work. And then there are updates: how are you going to update your locally virtualised Office install when another vulnerability floats around? It gets to be a lot of really hard (and often thankless) work. I would go with building a custom XP install (with driver packs and apps preinstalled) and using sysprep for each customer.

There are places to get the common open-source apps as prepackaged MSI installs. Then a simple customisation script can set you up for each client (see the sketch below). I am sure that virtual app appliances like the Firefox one will crop up as well.
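
Such a script needn’t be fancy - a per-client list of packages fed to the standard Windows Installer command line would do (the paths here are placeholders; /i installs, /qn keeps it silent):

```python
import subprocess

# Hypothetical per-client package list - swap in each customer's set.
CLIENT_APPS = [
    r"D:\packages\OpenOffice.msi",
    r"D:\packages\AVGFree.msi",
]

for msi in CLIENT_APPS:
    # msiexec /i <package> /qn = silent install via Windows Installer
    subprocess.run(["msiexec", "/i", msi, "/qn"], check=True)
```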

Si

No, really, the site says it should be a self-contained Firefox with its own home page and settings, yet when I run it, it acts just like I’m running another instance of my own executable. The Acrobat/Brochure example is much cooler.

We use something similar called VMware. As with the others mentioned, it uses a lot of space, is slow, and has limitations re: interacting with apps outside the VM. But it does have its uses (like setting up a complete development environment for an out-of-date software release in one fell swoop).