Not sure if this should go here or in General Questions, but it's gaming-related so here it is.
I’m playing a lot of multiplayer COD: World at War at the moment, and I’m intrigued by something about the resolution I’m able to play it at.
Playing single player, I can quite happily run it smooth as silk at 1024x768.
But when I switch to online multiplayer I have to drop down to 800x600 to get an acceptable experience - at the higher resolution it jumps and stutters too much. I can’t figure out why that is. It’s the same graphics card doing the job, rendering the game on screen. The information going out and coming in over the modem for multiplayer is presumably a separate data stream handled by the CPU? Why would a change of resolution on my box affect the speed of the connection?
Just because your graphics card has a GPU doesn’t mean your CPU sits around doing nothing when it comes to putting images on the screen. The GPU offloads a lot of the work, but there are still fundamental functions that have to be done by the CPU. Even in the ideal case, where the GPU is doing as much as possible, the program still has to go through the CPU to essentially say “run this instruction over on the GPU”. In practice the GPU handles some graphics duties while the CPU handles others.
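To picture it, here's a very rough sketch of the CPU work a typical game does every frame, and where multiplayer piles extra on top. This is nothing from COD's actual code; all the function names here are made-up stand-ins for the general idea.

    # Rough, self-contained sketch of per-frame CPU work in a typical game.
    # Function names are illustrative stand-ins, not anything from COD:WaW.
    import time

    def process_network(state):
        # CPU work multiplayer adds: read incoming packets, update the other
        # players' positions, serialize and send your own input to the server.
        state["remote_players_updated"] = True

    def update_simulation(state):
        # CPU work: physics, animation, hit detection, game logic.
        state["tick"] += 1

    def submit_draw_calls(state):
        # CPU work: figure out what's visible and hand the GPU its instructions.
        # Only after this does the GPU actually rasterize the frame.
        state["frames_rendered"] += 1

    def game_loop(multiplayer, frames_to_run=3):
        state = {"tick": 0, "frames_rendered": 0}
        for _ in range(frames_to_run):
            start = time.perf_counter()
            if multiplayer:
                process_network(state)   # extra load single player never pays
            update_simulation(state)
            submit_draw_calls(state)
            # If the CPU steps above run long, the GPU sits waiting for its
            # instructions and the frame arrives late; that's the stutter.
            print(f"frame took {time.perf_counter() - start:.6f}s")

    game_loop(multiplayer=True)

The point of the sketch: the GPU can't start drawing until the CPU has finished its part of the frame, so anything that eats CPU time (like the networking step) can make rendering hitch even though the graphics card itself hasn't changed.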
Adding players adds a lot of complications - more positions to track, more data to sync every frame - that can slow your game down. It’s just that simple.