Handing more RAM to onboard graphics?

I built my wife’s PC around Gigabyte’s GA-K8N51GMF-9 motherboard. It’s an AMD Socket 939 board that uses the GeForce 6100 on-board graphics processor, but it also has room for a PCI-Express card later on. Since the graphics are from the GeForce 6xxx family, they’re pretty good (not great, but pretty good). While researching all the various upgrade paths her machine can take, I realized that she’s got 1.5GB of memory in that system, but the graphics chip is only borrowing 64MB, which is the most it will borrow, according to the manual. My gut feeling is that something like a custom BIOS could strong-arm the machine into sharing more RAM with the video card, but perhaps there’s even a utility that permits what I’m thinking of – if there is, I haven’t been able to find it. So, two quick questions:

(1) What hardware or software on the motherboard and/or graphics chipset is preventing me from allocating more RAM to the graphics card?

(2) Would I notice any benefit if I were somehow able to overcome this obstacle?

My understanding is that it would allow higher resolutions and/or more colors at a given resolution.

It would also give you better performance while playing games, because the video card would have that much more space to store texture maps - which are essentially images that are wrapped around 3-D objects to make them look more realistic.
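To put rough numbers on that (a back-of-envelope sketch only – it assumes uncompressed 32-bit textures with full mipmap chains, and the texture count is made up; real games use compressed formats that shrink this considerably):

```python
# Rough estimate of video memory needed for a scene's textures.
# Assumes uncompressed 32-bit RGBA; a full mipmap chain adds about 1/3.

def texture_bytes(width, height, bytes_per_pixel=4, mipmapped=True):
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmapped else base

# A hundred 512x512 textures in view -- a plausible load for a game of
# this era, though the exact number is an assumption:
total = 100 * texture_bytes(512, 512)
print(f"{total / 2**20:.0f} MB")  # roughly 133 MB -- more than 64MB holds
```

So it doesn’t take an exotic scene before textures alone overflow a 64MB allocation and start spilling into (or staying in) system RAM.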

I really doubt that 64MB is insufficient for any resolution/colour depth you’re likely to use, and any game with texture sizes that would benefit from more video RAM will likely be choking on that integrated graphics processor regardless. I wouldn’t bother going to any lengths to allocate more graphics memory. Though the old nForce 2 chipset board with integrated graphics that I have can allocate up to 128MB, so I find it a bit odd that a 6100 board would be limited to 64. The limitation will be in the BIOS, not in any higher-level software.
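The desktop side of that is easy to check with arithmetic – a framebuffer is just width × height × bytes per pixel (a sketch; the assumption of three buffers, front plus back plus depth, is mine):

```python
# Framebuffer memory for common desktop modes: even front + back + depth
# buffers at 32 bits per pixel sit nowhere near a 64MB allocation.

def framebuffer_mb(width, height, bits_per_pixel=32, buffers=3):
    return width * height * (bits_per_pixel // 8) * buffers / 2**20

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.1f} MB")
# 1600x1200 works out to about 22 MB -- a third of the 64MB cap
```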

If you really want better graphics performance, you’ll do better springing for an actual card. Integrated graphics has to share memory and bandwidth with the rest of the system, so even a card with the very same GPU will perform better. Something like a 6600GT would likely give a very noticeable improvement.

:smack: That’s what I get for posting without checking Device Manager. The only range Device Manager shows for the chip is x000A0000 to x000BFFFF, which is just the legacy VGA window, but the BIOS does let it borrow up to 128MB – just as you suspected. I’ll double-check the BIOS options and see if I can bug it for more RAM, but asking for more than 128MB starts to impinge on resources the PC is already using.
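For anyone following along, the span of that Device Manager range is quick to compute, and it comes out to the classic 128KB legacy VGA window – nowhere near 128MB, which is why the real shared-memory figure has to come from the BIOS setting rather than that entry:

```python
# Size of the 0xA0000-0xBFFFF range Device Manager reports (inclusive).
start, end = 0x000A0000, 0x000BFFFF
size = end - start + 1
print(hex(size), size // 1024, "KB")  # 0x20000 -> 128 KB, not 128 MB
```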

Yeah, I’m going to add a card to the machine eventually, but the system isn’t even a year old, and I’d like to wait until the GeForce 7xxx family is a little cheaper. I’m holding out for a double upgrade around Christmas: a 7xxx card and an Athlon X2 4800+. The prices on the Socket 939 chips are already down to 30% of their release-date price. I’m thinking they’ll drop just a wee bit lower throughout the fall, too, with perhaps a slight spike around Christmas.

Keep an eye on the news. With no new Socket 939 models slated, AMD is going to be pushing more and more production over to Socket AM2 models. Eventually the supply of 939 chips will tail off and prices will stop dropping. This happened to me with my current machine - an old Socket A job with a 2100+, which I’d planned to upgrade to a 3000+, but then those became scarce and way overpriced relative to their performance. Happily, I’m about to build a whole new machine.

Even if you can make the CPU give up the extra memory, what makes you think the GPU is even set up to address more memory than that usefully?

Unfortunately, nVidia doesn’t seem to want to hand out their technical docs to just anyone, so I can’t check, but there’s probably a lot involved in setting up the GPU. If there’s an upper limit listed, it’s probably a real upper limit.

As you noted with your 3000+, when supply drops prices rise.

You will see no performance gain. And as someone else already pointed out, any game that uses higher-resolution textures will choke on an integrated video chip anyway. The latency of using non-integrated VRAM in graphics processing would drop performance even lower.

Integrated graphics are only for 2D applications and the occasional last-generation (or two) game. The upgrade you’re looking to get, though, will certainly fix that puppy up for some gaming goodness. You may want to wait for the first DirectX 10 cards to show up and then snag a 7950 GTX at what will likely be a decent price for some great performance.

Integrated graphics cards may not be able to play the newest and snazziest games, but they have no problem at all with 3D graphics.

I don’t have the option of using non-integrated VRAM – the integrated graphics use only the DIMMs that I’ve plugged in. But the GeForce 6100 has a vertex processor, supports Shader Model 3.0 and DirectX 9, and is definitely a 3D chip. It gets just a little crunchy when she plays World of Warcraft in a big dungeon with lots of spells going off simultaneously, but that’s the most graphically demanding thing she runs. Given that I just bumped it up to use all 128MB it can handle, I think she’ll see a minor improvement. I probably won’t move on the 7950 until it’s down near $150 or lower.

So I’m with iamthewalrus(:3= on this one (and thanks, (:3=, for pointing out that addressing RAM isn’t the same as being able to use it - that’s probably the real answer to my OP).

Also, thanks for the warning, Gorsnak. Believe me when I say that I’m watching the prices obsessively. I check inventory and prices at Newegg and Zipzoomfly at least every other day, and read SharkyExtreme’s CPU pricing guide every week to see what the trends are with other chips. The X2 4800+ has shed half of its price this summer, and historically, chips don’t go below 25% of their release price until they’re in the third-best class. Since AM2 and Core 2 Duo both outclass the X2 4800+, I know it’s coming soon. If I knew when the next big chip release was coming, I’d have a target date to look at – one more substantial price reduction is all it’s going to take to get me on board.

The big price drop was because of the Core 2 Duo release. Suddenly AMD didn’t have the performance edge over Intel, so they slashed prices to stay competitive. That big price drop was an anomaly - prices should behave in a more normal fashion in the coming months. Also, AM2 chips are no faster than their 939 brothers; it’s just that AM2 boards accept DDR2 RAM, and future development will all be for the AM2 socket. What will happen is that people with 939 boards looking to get a little more life out of them (like you) will want to upgrade to the fastest available 939 chip, which will result in higher demand for that particular CPU even as production of it falls off, leading to higher prices than one would otherwise predict for it. You might be able to get one used, but the same thing will apply. That’s exactly what happened to me on my Socket A board. I would just watch the prices and note when the Socket 939 X2 3800+ and X2 4200+ start disappearing from price listings. That’ll be about the best time to grab an X2 4800+, because it’ll soon be on its way out too at that point.