Nvidia has officially revealed their RTX 4000 series lineup of PC graphics cards. Opinions?

Today (September 20, 2022) Nvidia officially rolled out their RTX 4000 series cards.

It seems they have three cards on offer: a 4090 for $1600 and two 4080s at $900 (12GB) and $1200 (16GB).

The two 4080s really seem like a 4070 and a 4080. The difference is more than memory: the cheaper of the two is nerfed in other ways compared to the more expensive model (notably far fewer CUDA cores).

The pricing seems a bit outrageous. $1600 for the flagship? Also, cards like a 4070 and 4060 are conspicuously absent. My guess is they would be on par with the 3000 series cards that are currently flooding the market and plummeting in price, and Nvidia needs those to clear out first.

It also seems worrying that these cards will likely need a power supply upgrade to run, and not just any power supply but new ones with the appropriate connectors. I’m not sure those are even out yet.

We’ll need to wait for tests (looks like these hit the street on October 12, so review videos should pop up then), but the claims are dramatic. Nvidia says 2-4x faster than the 3000 series.

Opinions?

It sounds to me like they’re trying not to compete with the 3000 series cards, by pricing these way out of proportion to them. This pricing basically says “this is for those who must have the latest and greatest.”

That said, I don’t believe they were ever planning to launch the lower tier cards this year.

The cards lower in the stack (70, 60, etc) always come out later so no surprise there. Pricing is stiffer than I’d like but the last couple of years have been a test bed for what people will pay.

If it’s really 2-4x better (and more 4x than 2x), I would consider a 4080 over my 3080 Ti once benchmarks are out and prices come down a little. But I can comfortably wait and see how it shakes out.

Does that willingness include a power supply upgrade?

I forget where, but I think I saw 450W for these cards.

I already have a high-quality 850W supply, which is above the 750W they’re recommending for the 16GB 4080, so I’m probably good.

I’m also probably a little less price-sensitive than average because I have a kid I pass “old” hardware down to, or else I enjoy building systems from my old hardware and selling them off. So if I needed to get a new PSU, for instance, that wouldn’t be an immediate deal breaker for me.

I didn’t notice at first glance how far apart the two models of 4080 were. Yeah, the 12GB version should be a 4070 Ti or something rather than have people assume the main difference is 4GB of VRAM.

From this Reddit thread - https://old.reddit.com/r/Games/comments/xjbnbg/introducing_the_nvidia_geforce_rtx_4090_an/

At launch MSRP:
1080: $599
2080: $699
3080: $699
4080: $1199

Seems a bit disappointing, which is probably Nvidia’s goal. Expensive enough that you don’t want to buy one, but not so bad that you’re not thinking about it.

The performance claims are ridiculous and probably based on the DLSS 3.0 frame insertion stuff. It’s easy to get great frame rates if the GPU invents them. You can tell the claims are nonsense because they used Flight Simulator as their reference point, which is an absurdly CPU-bound game. $900 is a lot to ask if the 12GB 4080 is only 30% faster than a $700 3080.
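As a rough back-of-envelope check (a sketch only: the ~30% uplift and the launch MSRPs are taken from this discussion, not from benchmarks), that pricing leaves performance per dollar essentially flat:

```python
# Rough perf-per-dollar comparison. The +30% uplift for the 12GB 4080 is an
# assumption taken from the discussion above, not a measured benchmark; the
# prices are launch MSRPs.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance units per dollar spent."""
    return relative_perf / price_usd

rtx_3080      = perf_per_dollar(1.00, 699)  # baseline: 3080 10GB at launch MSRP
rtx_4080_12gb = perf_per_dollar(1.30, 899)  # assumed ~30% faster than the 3080

print(f"3080 10GB: {rtx_3080:.5f} perf/$")
print(f"4080 12GB: {rtx_4080_12gb:.5f} perf/$")
print(f"Change:    {(rtx_4080_12gb / rtx_3080 - 1) * 100:+.1f}%")  # about +1%
```

If the 30% figure holds, that’s roughly a 1% improvement in performance per dollar over a two-year-old card.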

We’ll see how it looks in six months, when the oversupply discounts may start coming in or Nvidia realizes they’re competing against used crypto cards more than they’d like.

I like that the 4090 seems like an actual flagship. The 3090 was a dud in comparison.

Still better than the 2000 series launch where the 2080 underperformed the $200 cheaper 1080 Ti.

I found the 3000 series cards to be overpriced for what they offered, and this seems like more of the same. They’re really leaning into machine learning applications, which might be the future of graphics, but it’s also an exploding field of research. It’s cool that these cards can run Stable Diffusion well, but that’s not actually what I want in a new video card. I wonder how much longer until folks turn to ATI for cards designed to render video games while Nvidia further pivots to making AI platforms.

I’m upset about DLSS 3, which is (partially?) hardware-accelerated on the 40 series cards, so it won’t be available on the DLSS 2 cards (20 series and 30 series). It’s supposedly a massive jump: either half again or double the frame rate (or maybe the improvement; I forget the details). Regardless, a huge jump.

So that really takes the shine off my dream of picking up a 3070 late this year / early next year. The reason I want a 30 series is DLSS, but it seems like the difference between no DLSS and DLSS 2 is equivalent to the difference between 2 and 3. In other words, getting a 20 or 30 series card is a half measure in terms of capitalizing on DLSS, which cuts my perception of their value substantially.

I mean, it’s almost enough to push me to start looking at AMD cards. But with AMD I think the DLSS equivalent (FSR) is all software-based. Meh. Hardware better.

That’s the point. It only helps in CPU-bound games, like open worlds with huge draw distances. And it’s supposed to help immensely in that situation.

This seems to be the case based on a graph Nvidia added to their overview page showing the performance in some non-DLSS titles (RE8 and Valhalla). The 3090 Ti reference point is about 20% faster than a 10GB 3080. The reality is that DLSS just isn’t that prevalent yet so making that the sole justification for buying isn’t a great plan, I think.

I’m not sure I want to see the kind of tower that can support that card without the PCI slot snapping off of the motherboard, nor can I imagine the game that would require it.

Filling out the table a little more:

Card | Launch | MSRP | In 2022$ | % Rise
980 | 2014-09-18 | $549 | $687 | -
1080 | 2016-05-27 | $599 | $739 | 7.6%
2080 | 2018-09-20 | $699 | $824 | 11.5%
3080 | 2020-09-17 | $699 | $800 | -2.9%
4080 | 2022-11-01 | $899 | $899 | 12.4%

*Using the cheaper 4080 model because Nvidia’s naming scheme is stupid.

Cites: 900 Series 10 Series 20 Series 30 Series 40 Series
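In case anyone wants to poke at the numbers, here’s a minimal sketch that reproduces the table above. The inflation multipliers are assumptions back-derived from the “In 2022$” column rather than pulled from an official CPI source:

```python
# Minimal sketch reproducing the MSRP table above. The inflation multipliers
# are approximate, back-derived from the "In 2022$" column (not official CPI).
cards = [
    # (name, launch date, launch MSRP in USD, approx. multiplier to 2022 dollars)
    ("980",  "2014-09-18", 549, 1.2514),
    ("1080", "2016-05-27", 599, 1.2337),
    ("2080", "2018-09-20", 699, 1.1788),
    ("3080", "2020-09-17", 699, 1.1445),
    ("4080", "2022-11-01", 899, 1.0000),  # cheaper 12GB model
]

prev_adjusted = None
for name, launch, msrp, factor in cards:
    adjusted = msrp * factor
    rise = "-" if prev_adjusted is None else f"{(adjusted / prev_adjusted - 1) * 100:.1f}%"
    print(f"{name:<5} {launch}  ${msrp:>4}  ${adjusted:>4.0f}  {rise}")
    prev_adjusted = adjusted
```

Swap the last row’s MSRP for the 16GB model’s $1,199 and the real-terms jump over the 3080 comes out closer to 50%.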

“Require it?” There aren’t any. No one is going to make a game exclusive to the top tier card. Technically, the xx80 series is supposed to be the high end gaming enthusiast card and the xx90 series is supposed to be more for production and AI learning and that sort of thing (to help justify the cost) but, of course, people will still buy the xx90 cards to have “the best” and Nvidia will be happy to sell them.

I’m curious about the heatsink on that card. There’s only one fan that I can see, and I assume you could get by with fewer fins and two or three fans cooling them rather than relying on one fan and a cubic foot of milled aluminum. I suspect it’s an intentional choice for show, and AIB cards will be in the 2-3 slot range rather than requiring four.

The single-fan card you see is Nvidia’s Founders Edition card. Nvidia themselves make and sell that video card (to the annoyance of AIBs). There are standard card layouts like you are used to seeing (see below), but they are chunky cards. All seem to be at least three-slot cards (and may be 3.5 to 4 slots). I think the Founders Edition is four slots.

Yeah, that was my assumption. The Founders Edition card being a four-slot was an intentional choice, since there are more streamlined ways of cooling a card than one fan and three slots’ worth of aluminum fins. I suspect it was designed that way to be excitingly chonky, but the Tweet was acting as though this was the norm for 4090s. Founders cards having different cooling than the norm seems to be pretty common, with previous generations using a blower-style cooler that few AIB cards utilized.

Judging from the photo you posted and others I saw, most of them are likely to be three-slot cards. Which is still big, but not different from the 3090s.


I think the cheaper version is more akin to a 4070 and the expensive one is the real 4080. The difference between the two is bigger than the gap you’d expect between a 4080 and a 4080 Ti.

If we use the more expensive 4080 cost of $1,199:

Card | Launch | MSRP | In 2022$ | % Rise
980 | 2014-09-18 | $549 | $687 | -
1080 | 2016-05-27 | $599 | $739 | 7.6%
2080 | 2018-09-20 | $699 | $824 | 11.5%
3080 | 2020-09-17 | $699 | $800 | -2.9%
4080 | 2022-11-01 | $1199 | $1199 | 49.9%

I see they forgot the SLI bridge connector. How am I going to use 8 slots’ worth of Nvidia hotness?