My husband is a trader at a small hedge fund, and at work, he has six monitors at his desk. This is a pretty common setup nowadays, as it lets him monitor lots of different things rather than constantly switching between windows. There were a bunch of extra monitors kicking around the office, and he got permission to take a few home to create a four-monitor setup, for when he’s working on something in the evening. He’s got them nicely lined up on his desk, but neither of us has any idea how to actually go about setting up his computer to work with all of them. It’s a newish Dell PC, but whatever was the most basic version being sold six months ago. All of the monitors are Dells as well, and the graphics card is an ATI Radeon HD 3200. I’d like to surprise him with a fully functional setup - do I need to buy another video card? If so, where do I put it? Will I need to download extra drivers? [embarrassed expression] Where does one even plug in more than one monitor [/embarrassed expression]
I am mostly technologically ignorant - I know enough to, say, be able to find and download new drivers for something if it’s giving me problems, but have never actually installed any hardware more complicated than a new printer, or been able to seriously troubleshoot anything.
We’ll need to know the model number of the PC in order to see what types of solutions are available to you.
But basically, most low-end video cards (GPUs) can support two monitors at the same time. Meaning that yes, you would need to add another card to your setup to make four work. But we would need to know whether your PC has another expansion slot free to accommodate two cards.
Another option would be to purchase an Eyefinity 6 card from ATI and use that. It can support up to six monitors at once.
The cheapest one I found was the HD 5870 Eyefinity 6 Edition. But that’s a $280 part that needs a power supply that can adequately feed it.
So yeah, we’ll need the model number of that dell.
This isn’t going to be that hard to do. Virtually any modern computer can accept an extra non-gaming-class graphics card. Even a closed-up netbook or nettop can take advantage of a USB video adapter. Many computers, especially ones made in the past few years, have built-in dual-display-capable video adapters on the motherboard. In that case, you would just need to add an extra card into the computer’s PCI-E slot. Such a graphics card, like a Radeon HD 5450, is very inexpensive at less than $50. We’d need to know the configuration of the current home computer.
Looks like there’s a PCIe x16 slot in there available. So assuming this isn’t the “s” slim-line form factor, all you need is to follow Cleophu’s advice. A cheap modern card like the ATI 5000 series will host three monitors through Eyefinity. You can then use the built-in video out for the fourth monitor.
Model number of the home computer. If it’s a custom system, the model of the motherboard. If you can’t find this information, download and run CPU-Z and give the model under the Mainboard tab.
The specs say there is one PCIe x16 slot, one PCIe x1 slot, and two PCI slots, plus a 300-W power supply. The existing ATI Radeon HD 3200 appears to be integrated (that is, built onto the motherboard).
So you should be able to pop another video card into the PCIe x16 slot. The problem is, can the power supply give enough additional power to support it?
I suppose you could plug the computer into one of those little plug-in power meters, and then turn the computer on, but that wouldn’t necessarily give you the maximum power usage. For that you’d need to be running some program that takes a while to complete and would max out the processor, like transcoding a video segment.
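If you don’t have a video handy to transcode, a short script can hold every core at full load while you watch the meter. This is just a hypothetical sketch, not a real benchmarking tool - it only loads the CPU, not the GPU or disks, so the reading is still a lower bound on true worst-case draw.

```python
# Crude CPU burner: spin useless arithmetic on every core for a fixed
# time while you read the power meter. Stops itself automatically.
import multiprocessing
import time

def burn(seconds):
    """Spin doing useless arithmetic for `seconds`; return loop count."""
    deadline = time.monotonic() + seconds
    n = 0
    while time.monotonic() < deadline:
        n = (n + 1) % 1000003
    return n

if __name__ == "__main__":
    # One worker per core so every core is busy. Bump the duration up
    # (e.g. to 60) to give yourself time to take the reading.
    procs = [multiprocessing.Process(target=burn, args=(2,))
             for _ in range(multiprocessing.cpu_count())]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```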
The specs simply say, “300 W”. Nothing about how much power the existing hardware uses. There is a maximum heat dissipation given of 1023 BTU/h (!) but the manual says that’s calculated using the wattage rating of the power supply.
The service manual doesn’t seem to have this information either, so some experimentation may be called for.
We should also note that much depends on which integrated Radeon this system has - Dell offers several, some using system memory and some with their own integrated RAM. If they’re using system memory, again, dual-heading off that card will create a performance hit.
It’s not going to be an issue, or even noticeable for desktop productivity. Typically a lower-end card will reserve 128-256 MB of system memory for exclusive use by the video hardware. The basic VRAM use of a 1920x1080 monitor, for instance, is about 8 MB (1920x1080 pixels * 4 bytes per pixel). So even four monitors would present no challenge to the available VRAM assuming they could all be hooked up. The additional VRAM is typically used in 3D applications to store textures, shaders, etc., but that’s not one of the applications the OP indicates will be needed.
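The framebuffer arithmetic above is easy to sanity-check yourself. A quick back-of-the-envelope calculation, assuming 32-bit color (4 bytes per pixel) and 1080p panels:

```python
# Basic framebuffer size: width * height * bytes-per-pixel.
BYTES_PER_PIXEL = 4  # 32-bit color

def framebuffer_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL):
    """Return the framebuffer size in megabytes for one display."""
    return width * height * bytes_per_pixel / 1e6

one = framebuffer_mb(1920, 1080)  # ~8.3 MB per 1080p monitor
four = 4 * one                    # ~33 MB for all four monitors

print(f"One 1080p framebuffer: {one:.1f} MB")
print(f"Four monitors:         {four:.1f} MB")
```

Even quadrupled, that’s a rounding error against the 128-256 MB a low-end card reserves.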
A card like the GeForce 8400 GS will be great and is still easily available. Most of them have VGA, HDMI, and DVI ports all on the one card, with the DVI port adaptable to VGA if needed. The power requirements are unlikely to be a problem. With max load estimates of 65 W for the processor, 70 W for the entire motherboard chipset, 30 W for the hard drive, and 10 W for RAM, we’re at 175 W for the base computer.
Look at this overview of measured full-load power consumption for video cards. The full load consumption for a GeForce 8400 GS is about 24 watts. Not a big deal. Even the max possible draw for a PCI-E slot of 71 W will keep the system under 300W, and this card uses less power than the factory-optional Radeon HD 3650.
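Adding up the estimates quoted in this thread makes the headroom concrete. These are the worst-case guesses from the posts above, not measurements of this particular Dell:

```python
# Rough power-budget sanity check using the estimates from the thread.
psu_watts = 300

base_load = {
    "CPU": 65,
    "motherboard/chipset": 70,
    "hard drive": 30,
    "RAM": 10,
}

card_full_load = 24   # measured full-load draw quoted for a GeForce 8400 GS
pcie_slot_max = 71    # max PCI-E slot draw figure quoted above

base = sum(base_load.values())     # 175 W
typical = base + card_full_load    # 199 W
worst_case = base + pcie_slot_max  # 246 W

print(f"Base system:  {base} W")
print(f"With 8400 GS: {typical} W")
print(f"Worst case:   {worst_case} W (still under {psu_watts} W)")
```

Even the worst-case total leaves roughly 50 W of margin on the 300-W supply.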
There might be a setting in the BIOS to keep the integrated video active while also having the second video card installed. From what I found online, setting the integrated card as the primary might work. I don’t have a Dell, so I can’t test this to find out if it works, however.
Well, IIRC you are pretty much emulating a VGA card in RAM with these units, so you do have some CPU and memory overhead, but for basic desktop office-type stuff these things are great. I was using one for a while as a second monitor for helper sites in WoW while the main display ran the game. Worked great.
Just out of curiosity, is there still a way to adapt an AGP board to a PCI-E slot? I know there were products coming out 5 years ago when PCI-E was just starting to take off, but I haven’t ever actually seen anything. I’ve never wanted to go quad-monitor, but I’ve got a decent AGP dual-output board that I’d love to do something with other than throw it in the trash. Heck, the OP could have it if she can get the adapter.