If two computers are networked together and a person accesses a program installed on the remote machine, but not on the machine they’re using, which processor gets used in running the program? Is it the processor on the machine where the program is actually installed, or the processor on the machine the person is sitting at? Or do both of them somehow share the load? Do both OS’s have to be the same version? Do they even have to be the same OS? (i.e., if one of the machines is a Linux machine and the other an XP machine, could I log into the Linux machine and access Word on the XP machine?)
The PC that launches the application runs the program, no matter where it is stored. As a test to make sure I’m not insane, I went through Network Neighborhood to one of my other computers and ran FreeCell from it. It showed up in my Task Manager’s process list. I then opened a VNC connection to the other computer just to make sure FreeCell wasn’t running there, and of course it wasn’t. I don’t know about running a Linux program via a Windows box, but I’d guess it wouldn’t work.
If we’re talking about Windows peer networking, then the processor where the program is actually stored gets involved in transferring the contents of the file to the machine that is requesting it; the actual processing of program instructions is carried out on the machine running the program.
During the process of running, the processor where the program is stored may be asked for more data, but it won’t be asked to run that data.
But if we’re talking about a server and a client, or a terminal and a mainframe, that’s a whole different kettle of worms.
I’m assuming that you’re just talking about two computers linked together to use file/directory sharing rather than anything like RPC (Remote Procedure Calls).
If you’re on machine A, have linked to a directory on machine B, and then start up a program that resides on machine B, then your machine (A) will be the one running the program. Machine B is just acting as a disk resource.
The two OS’s don’t have to be the same in order to share files. However, the machine you’re running on does have to be compatible with the program residing on the second machine. For example, a Linux machine linked to the disk drive of a Windows 2000-based PC could see the NOTEPAD.EXE program residing there, but couldn’t run it unless it had a Windows emulator.
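To make that "can see but not run" point concrete, here's a minimal Python sketch. Windows executables begin with the two-byte "MZ" magic number, so a Linux box can open and read NOTEPAD.EXE over a share just fine; the header simply marks it as a format the Linux kernel can't execute natively. (The file below is a faked stand-in, not a real copy of NOTEPAD.EXE.)

```python
import os
import tempfile

def is_windows_exe(path):
    """True if the file starts with the DOS/PE 'MZ' magic bytes."""
    with open(path, "rb") as f:
        return f.read(2) == b"MZ"

# Stand-in for NOTEPAD.EXE as seen over a file share: we only fake the
# two magic bytes, which is all this check looks at.
fake_exe = os.path.join(tempfile.mkdtemp(), "NOTEPAD.EXE")
with open(fake_exe, "wb") as f:
    f.write(b"MZ" + b"\x00" * 62)

# True: readable over the share, but not runnable on Linux
print(is_windows_exe(fake_exe))
```

Reading the file is just a data transfer, exactly as described above; actually executing it would require the local machine to understand the Windows executable format.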
Hmm, what do I need to set one machine up as a server? (Assuming there are no additional hardware requirements.)
Dunno; even if you were to install NT server on one of the machines, the programs stored on it would still run locally when the clients launched them. Server applications are generally written specifically to handle processing at the server level; database server applications, for example, accept submitted tasks and then work on them independently of the submitting client.
I think there are server applications that allow other programs to be run at the server, with the output being piped to the workstation (and the input from keyboard etc being piped back to the server). I think Citrix may be one of these, but I can’t say for sure.
I shall also be quite surprised if nobody says that Linux can do this sort of stuff.
The problem with Linux is that I’ve got some hardware and software for which there’s no Linux version. I wouldn’t mind running one of the machines under Linux, but I’d like to be able to access the hardware on the other machine. I’ve got a TV tuner card on one machine along with a DVD drive, and just for the hell of it, I’d like to be able to access those things remotely from my other machine. However, it doesn’t have the processing power to do so.
I realize that this might be a total impossibility, but if it’s possible to do, I’d like to be able to do it.
(Oh, and I know that there is software out there to allow you to watch DVDs on a Linux machine [tain’t legal last time I heard, so don’t link to it], but nothing for the TV tuner card I’ve got.)
Mangetout, you’re thinking of Terminal Services, aka Remote Desktop. On an NT server with Terminal Services installed*, you sit at your client machine and open a Terminal Services connection to the server. What you see is a window (which can be full-screen) that has a “session” where you get a full desktop, start menu, task manager, etc, all of which are actually executing on the server. Many people can open these sessions at once, and they all run applications in their own compartment so they don’t step on each other (assuming the applications are well-written in that regard, and most these days are). An interesting side effect is the ability to disconnect from a session and have your programs continue running, so you can reconnect later (even from a different machine) and continue where you left off.
There are also “Windows Based Terminals” which are made by companies like Wyse. This is a little box you attach a keyboard/mouse/monitor to, and all they do is allow you to connect to a Terminal Server – you don’t run applications on the client at all.
* Terminal Services is available on NT 4.0 Terminal Server Edition (which is separate from NT4 Server), Windows 2000 Server, and the upcoming Windows Server 2003. In addition, XP Pro offers TS in a limited fashion – you can connect to your computer remotely and run programs there, but only one session can be active at a time (so you can’t have someone sitting at the XP Pro machine plus someone logged in remotely).
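The thin-client split described above – all the processing on the server, just input and display on the client – can be sketched in a few lines of Python. This is only a toy running over localhost sockets, nothing resembling the actual RDP protocol, but it shows the division of labor:

```python
import socket
import threading

def server(listener):
    conn, _ = listener.accept()
    with conn:
        keys = conn.recv(1024).decode()   # "keystrokes" arriving from the client
        screen = keys[::-1].upper()       # all processing happens server-side
        conn.sendall(screen.encode())     # "screen updates" sent back

listener = socket.socket()
listener.bind(("127.0.0.1", 0))           # any free port
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=server, args=(listener,), daemon=True).start()

client = socket.socket()
client.connect(("127.0.0.1", port))
client.sendall(b"hello server")           # the client only ships input...
reply = client.recv(1024).decode()        # ...and displays what comes back
client.close()
print(reply)                              # prints "REVRES OLLEH"
```

The client never runs the application logic at all, which is exactly why a "Windows Based Terminal" with no local applications is enough hardware on the client end.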
Note that a similar concept of running apps remotely but displaying them locally exists on Unix under the X Window System, but is about 100 times more complex. Under X, you have much greater control and flexibility over where a window shows up (you can conceivably have a single application where one window shows up on the server and another window shows up on the client), but configuring things to run remotely is a nightmare for the layman (for one thing, most people are very confused about the seemingly backwards “client” and “server” terminology – X considers the computer you’re sitting at the “display server” and the one where the application is running the “display client” – it’s a little like learning to drive a car while mentally substituting “right” for “left” and vice versa). Nextstep had an application-remoting scheme as well, so I assume MacOS X (which is based on Nextstep) could be made to do it, although I’ve never heard any mac folks make any noise about it, so maybe Apple buried it or de-emphasized it in favor of simplicity.
Tuckerfan (and thank you, I didn’t know I had fans), there are Remote Desktop clients for Linux (I know of one called rdesktop) which will let you log in to an XP Pro machine and run programs there. However, you’ll find that apps which demand high video performance don’t work well at all via RDP. Many video apps won’t work at all, because they require special access to the video hardware in order to get the performance you need, and in a remoted case, the “video hardware” is emulated as a pretty simple device (meaning it doesn’t have much of the acceleration that the software has come to rely on in the fancy modern video hardware). Even if an app is written in such a way that it works well with the basic video capabilities available in a TS session, you probably won’t get the performance necessary to do full-screen video. It’s just not that practical yet.
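Some back-of-the-envelope arithmetic shows why full-screen video over a remoted desktop is a losing proposition. An uncompressed stream at even modest resolution wants far more bandwidth than a 10 or 100 Mbit LAN can carry, before any protocol overhead. (The figures below are illustrative assumptions, not RDP measurements.)

```python
# Assumed parameters: modest full-screen video, uncompressed.
width, height = 640, 480
bytes_per_pixel = 3            # 24-bit color
fps = 30                       # frames per second

bytes_per_sec = width * height * bytes_per_pixel * fps
mbits_per_sec = bytes_per_sec * 8 / 1_000_000
print(f"{mbits_per_sec:.0f} Mbit/s")   # ~221 Mbit/s uncompressed
```

That's roughly 221 Mbit/s, so without video-specific compression (which RDP doesn't do) the network link saturates long before you get smooth full-screen playback.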