New Nvidia cards drop - lots of cool new graphics tech coming down the pipe for PC games!

The new Nvidia cards just came out and man, are they making a splash. Specifically, their price-to-performance and their performance per watt are pretty amazing.

The 970 is going for $350 and it performs almost like a 780 Ti ($600). It’s smaller and uses less power too! The 980 performs better than the 780 Ti, is going for $500, and also uses less power.

If you’ve been on the fence or looking for a possible upgrade, I think the 970 is an amazing choice. Benchmarks have it pulling nearly 60 FPS in Battlefield 4 at 1440p, and around 30 FPS at 4K resolution!

Nvidia also announced some new features for its “GameWorks” API, which is currently being used in games like Batman: Arkham Knight, The Witcher 3, Assassin’s Creed Unity, Far Cry 4, Borderlands: The Pre-Sequel, and several others. The new features include a more efficient MSAA filter, new physics and destructibility APIs, and, what I personally found most interesting, a full global illumination solution!

Can’t wait to start seeing these new tools being used by devs!

I was really disappointed by the 980 specs. It’s basically a side-grade from the 780 Ti, which you can also find discounted to around $570 these days. I know the 980 is a little better, but it’s application-sensitive. I can’t help but think the 980 was released for Nvidia’s benefit more than gamers’: it lets them move away from that expensive GK110 GPU.

The 970 is pretty slick though, which isn’t surprising given that it costs two-thirds of what the 980 does. It’s definitely the card most people should be thinking about, especially anyone considering 980 SLI. 3-way 970s for less money? Sounds like a winner.

I’m sure that 980 price isn’t going to stand after AMD moves (next month?).

I just wish I had sold my 780 Ti before this and used the money for two 970’s. :frowning:

I want to buy one because they proved man walked on the moon.

More accurately, they provided yet another proof (not that we really needed any) that we really did put people on the moon.

They used a real-time lighting model to simulate what the photo of Buzz Aldrin exiting the lander should look like (conspiracy theorists claimed Buzz should have been in shadow, not lit as brightly as he was).

Nvidia showed it had to look the way it did. Better still, when they first ran the simulation, their picture looked too dark. They couldn’t figure out why for a bit, then realized that Neil Armstrong was also a light source: his white suit reflected light onto Buzz. When they included him in the calculation, it all matched with remarkable accuracy.
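To see why the suit matters so much, here’s a toy single-bounce estimate. This is emphatically NOT Nvidia’s demo code, and every number (albedos, form factors) is a made-up illustrative value; it just shows how a bright diffuse reflector acts as a secondary light source on a subject in shadow:

```python
# Toy single-bounce lighting estimate -- all numbers are illustrative guesses.

SUN = 1.0  # normalized direct sunlight intensity


def bounce(source_intensity, albedo, form_factor):
    """Light re-emitted by a diffuse surface toward the subject."""
    return source_intensity * albedo * form_factor


# Buzz is in the lander's shadow: no direct sunlight reaches him.
direct_on_buzz = 0.0

# Bounce off the lunar regolith, which is surprisingly dark (albedo ~0.12).
from_ground = bounce(SUN, albedo=0.12, form_factor=0.5)

# Without Armstrong in the scene, the picture is dim.
without_armstrong = direct_on_buzz + from_ground

# Armstrong's white suit (albedo ~0.9) acts as a secondary light source.
from_suit = bounce(SUN, albedo=0.9, form_factor=0.2)
with_armstrong = without_armstrong + from_suit

print(f"without suit bounce: {without_armstrong:.2f}")  # 0.06
print(f"with suit bounce:    {with_armstrong:.2f}")     # 0.24
```

Even with these rough numbers, adding one white reflector quadruples the light landing on the subject, which is exactly the kind of effect the “too dark” first render was missing.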

They also showed there were stars in the moon sky.

Check it out here: http://www.wired.com/2014/09/nvidia-moon/

This was all done on their Maxwell chip.

Thanks for posting that link, that was pretty cool to watch.

So what sort of resolutions are these cards good for? At what resolutions would they be overkill? I have a 1920×1080 monitor. I expect these cards are all well beyond my needs, say, 1080p at high (not ultra) settings for most modern games?

If the OP doesn’t mind the slight hijack: which is the cheapest card that can still be classified as overkill for 1080p at high settings?

These cards will handle modern games at 1440p fine, which only a few years ago pretty much required SLI/CrossFire to run at anything like high settings. They’ll definitely fly at 1080p. Downsampling or serious AA filters (like TXAA) might even be an option for many games. And the high-end 980 can do 4K at 30 FPS in many games at medium-to-high settings.
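For anyone unfamiliar with downsampling: the idea is to render at a higher resolution than your monitor and average the result down, which smooths jagged edges. A minimal sketch (my own toy example, not any driver’s actual implementation) on a grayscale “image”:

```python
# Toy downsampling (supersampling) illustration:
# render at 2x resolution, then average each 2x2 block into one output pixel.

def downsample_2x(img):
    """Average 2x2 blocks of a grayscale image (list of equal-length rows)."""
    h, w = len(img), len(img[0])
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]


# A 2x2 high-res patch straddling a hard black/white edge...
hi_res = [[0, 255],
          [255, 0]]

# ...becomes one soft gray pixel instead of a jagged stair-step.
print(downsample_2x(hi_res))  # [[127.5]]
```

The cost is that the GPU really is rendering 4x the pixels, which is why it takes cards this fast to make it practical at 1080p.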

I would say the 900 series, and the 970 in particular, is the sweet spot for performance now and well into the future: 4 GB of VRAM and beating the pants off any console for around $340. That 4 GB buffer will be critical for keeping pace with texture quality, which is why I wouldn’t recommend some of the cheaper but still very powerful GPUs with only 2 GB of VRAM, or most of the 3 GB ones. And the custom 4 GB 600-series and 6 GB 700-series cards are pretty expensive right now.

Yeah, Shadow of Mordor just launched and requires a seemingly obscene 6 GB of VRAM for ultra-level graphics at 1080p. You can, of course, play it at high graphics (3 GB) or medium on down instead, but the ceiling is way up there. Watch Dogs requires a minimum of 3 GB of VRAM for ultra, and I assume we’ll see other games going up as well.

This would be the system-requirement bump following the new consoles, I guess. Of course, medium PC graphics is still PS4-level graphics, so you’re not required to run out and buy the biggest cards right now.
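If anyone’s wondering why texture quality eats VRAM so fast, some quick back-of-the-envelope math helps. These are illustrative numbers of my own, not figures from any specific game:

```python
# Rough sketch of uncompressed texture memory cost.

def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate size in MB of one texture (4 bytes/pixel = uncompressed RGBA).

    A full mipmap chain adds roughly one third on top of the base level.
    """
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)


print(f"2048x2048: {texture_mb(2048, 2048):.0f} MB each")  # 21 MB
print(f"4096x4096: {texture_mb(4096, 4096):.0f} MB each")  # 85 MB

# A scene streaming in, say, 50 uncompressed 4K textures:
print(f"50 x 4K textures: {50 * texture_mb(4096, 4096) / 1024:.1f} GB")  # 4.2 GB
```

Real games use block compression (DXT/BCn), which cuts these numbers by roughly 4–8×, but the scaling is the point: each doubling of texture resolution quadruples the memory, so “ultra textures” blow past a 2–3 GB buffer very quickly.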

I’ll buy one in five years when there’re actually games that use this tech.

Assuming anybody cares about this tech in five years.

It doesn’t require 6 GB for ultra graphics; it suggests 6 GB for Ultra textures only. All the other settings depend more on GPU performance than on VRAM. And the devs have said that the Ultra textures are the unoptimized, full-resolution textures the artists originally created, while High and below are the optimized assets. So it’s a cool little extra, but it doesn’t really change the image quality much at all. It’s basically a “Hey, we dumped the raw assets in there, enjoy!” thing, not at all representative of what you’d normally have in a game.

On top of that, it turns out 4 GB cards are actually running the Ultra textures fine.

What tech? What are you talking about?

Everything in here is being used in games out TODAY, and in many upcoming ones. The only exception is voxel-based global illumination; I don’t know when we’ll see that in a game.

That’s funny: I’m really excited about it because it uses so much less power.

I’m sorry, but if they “just announced” a bunch of new features, how are those going to be “used in games out TODAY”? These two things are mutually exclusive.

I’d consider the ultra textures to be part of the ultra graphics, but maybe they’re a separate download or something (like the Skyrim High-Def pack was). My point was merely that you’re seeing higher demands for the full bells-and-whistles package than you were before.

What do you folks expect this will do to the prices of the more entry level cards in the coming months? Will it have any effect at all? Are there new cards based on the same chip and architecture expected to be introduced at lower price points as well?

Just to clarify, I don’t mean the absolute entry level; say around the $200–250 range.

It’s about goddamn time for a new graphics card worth buying. I’m using a GTX 460 I bought on clearance right before the next generation, which was a damn good buy in retrospect. The last few generations of cards have been pretty underwhelming, with no reasonably priced substantial upgrades. But I also bought a 1440p monitor recently, so that GTX 460 is starting to have a hard time keeping up…

So I think there’s a 970 in my future once I have my pennies saved, and maybe once they start bundling the new Borderlands too.

AMD is going to have to slash their prices a lot to remain remotely competitive with the 970. Maybe you’ll be able to get a 290 or 290X for $250? Also, Nvidia is going to launch the 960 by the end of October, which may end up being the best ~$250 card you can buy.

Wondering the same after reading the OP. It left me puzzled.

As per the OP

Hardware developers let game developers know what’s coming up on the hardware scene so they can both develop concurrently, so that there’s a shorter lag time for adopting new techniques and new technology.

And there are lots of technologies that work with everything. G-Sync doesn’t require game-engine support, as far as I know, for example. And of course the extra horsepower generally lets things run faster and at higher resolutions, better post-processing solutions can be forced by the card independently of the game, etc.

I’m trying to get my hands on the MSI 970 but they’re back-ordered right now; waiting on the next shipment from China, I presume.