I know, that looks really promising! Hoping the real-world reviews show the same thing. Not so much for MSFS, since for a civilian flight sim 30 fps is all you need, but I am hoping to get into DCS this winter. My son has been pushing me to actually start flying, and he wants to go on missions together. Multiplayer DCS is a hog on resources, and a 4080 would really help out there too.
There’s no reason to get one of these if you’re playing at 1080p, probably not at 1440p either (unless you have a very high refresh rate monitor), or if you’ll be CPU-bottlenecked anyway. That eliminates the large majority of PC owners. The xx80 series has always been the top-end gaming card, with the xx90 nominally for actual “work” purposes. You don’t buy a top-end card unless you have a top-end system to put it in.
The mid-range models are the xx70 cards and maybe the xx60Tis. Those are cards capable of maxing out 1080p and doing very well at 1440p, and maybe some 4K if you’re just weird like that. As more people eventually upgrade to 1440p or 4K, maybe this will shift, but we’re not there yet. Right now, only 20% of gamers play above 1080p, and over half of those are at 1440p (so a 3070Ti or 3080 would suffice). You don’t make “middle of the road” cards for the remaining 10% of the market.
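To make that segmentation arithmetic explicit, here’s a quick sketch in Python using the figures quoted above (treat the 20% and “over half” numbers as the rough survey-style estimates they are, not exact data):

```python
# Back-of-the-envelope split of gamers by resolution, using the
# rough figures quoted above (illustrative, not exact survey data).
above_1080p = 0.20              # share of gamers playing above 1080p
at_1440p = 0.5 * above_1080p    # "over half of those" -- call it half
above_1440p = above_1080p - at_1440p

print(f"1080p or below: {1 - above_1080p:.0%}")  # ~80%
print(f"1440p:          {at_1440p:.0%}")         # ~10%
print(f"4K and up:      {above_1440p:.0%}")      # ~10% at most
```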
Hopefully they’ll be smart enough to buy a card that’s practical for their needs (which will probably be a 4060) but that’s on them. Or buy an AMD card depending on how those price out if Nvidia’s gonna be dumb about it.
(I should add that I’m not defending Nvidia’s pricing which I think is too high and hope AMD forces them down with their pricing. Or defending their inane double 4080 scheme. My point is solely that most people, by the numbers, shouldn’t be buying these high end cards and the ones who do shouldn’t be balking at needing a new PSU since that’s the way the entire high end market is going, including new motherboards)
There are also new enhancements like ray tracing that you can only do in newer generations of cards (or, at least, do reasonably well). Granted, not many games benefit from ray tracing but if you are someone who cares then you really need a newer generation card.
And is Nvidia even planning on an actual 4060 card? They did not mention it at their event (unless I forgot).
Nvidia always releases the lower tier cards well after the big launch of the flagship cards. 4090 and 4080 are flagships. 4070, 4060, etc won’t come out until late winter or spring, probably.
The 3080 launched in September 2020, the 3070 six weeks later in late October, the 3060Ti in December, and the 3060 not until February 2021 (and the 3050 in early 2022!). I suspect they’ll be delaying the 4070-on-down cards due to surplus high-end chips from their TSMC deal fiasco. They’re not going to talk up later cards because they want you to buy these cards, not give you a reason to think “well, the 4060Ti sounds like it’ll be more my speed, so I’ll wait.” Which is exactly what your average person with a 1080p screen and an i5-10600 should be thinking.
Apparently enough backlash on the 12GB “4080” variant has convinced Nvidia not to release it after all. Presumably, it’ll be back with new branding as a 4070Ti or 4070 or whatever.
The RTX 4080 12GB is a fantastic graphics card, but it’s not named right. Having two GPUs with the 4080 designation is confusing.
So, we’re pressing the “unlaunch” button on the 4080 12GB. The RTX 4080 16GB is amazing and on track to delight gamers everywhere on November 16th.
Guess they were worried too many people were going to opt for the $899 card instead of the $1199 card.
So in watching the Gamers Nexus reviews for the 4090 I learned something new: stay away from Founders Edition cards (at least pre-40 series FEs) because they generally have shit thermals. For some reason I used to think Founders Edition cards were the best, but apparently they’re the worst.
Or at least they were, until Nvidia went ham on the thermals for the 4090. So much so that EVGA said fuck this, I’m out.
The 3000 series Founders cards had decent thermals, if I remember right. Enough so that I remember people/reviewers saying “Hey, these are actually respectable” after the hot, noisy performance the 1000 and 2000 series FE cards offered. Nvidia has a weird love for a blower design for its coolers that operates about as subtly as a leaf blower or vacuum cleaner. I wouldn’t be surprised if “bad FE coolers” were back for the 4000 series.
The opposite. Thermals on the 4090 founders edition are supposed to be excellent. So good that thermals may no longer be a selling point for buying third party partner boards, though of course noise profile still is. And I guess RGB?
Still, Steve from Gamers Nexus wondered whether the excellent FE thermals were a contributing factor to EVGA noping out of the market entirely.
I missed your “(pre-4000)” note. Totally my fault. Good to hear that Nvidia has actually learned how to cool a card. I suppose people will buy AIB cards for availability (esp if Nvidia does the stupid FE only through Best Buy again) and “factory overclocking” and aesthetics and just plain not knowing better.
Don’t know about the 4000s but AIB partners were working at a disadvantage with the overtuned 3000 cards that were sucking up more power (and generating more heat) than they needed to run at their performance levels.
I saw a review of the Asus ROG Strix 4090 (I cannot remember where now), which is a massive three-fan card. All air-cooled 4090s are quite big, but this is one of the biggest I have seen.
While its thermals were better than the Founders Edition’s, it wasn’t by a whole lot (about 4 degrees at the top end). It made less noise than the FE, but again, not by much (about 2 dB). Gaming performance was nearly identical (within the margin of error).
Personally, I like a blower fan setup better. Get that heat out of the case.
EDIT to add: I forgot to mention that the Asus is $1,999, so a lot more expensive.
Whatever they are in reality, they’re pricing and branding them like enthusiast cards. Hence only doing 80 and 90 cards. Those numbers mean “enthusiast” and “bragging rights enthusiast,” respectively. Gamers who care about “bang for the buck” don’t tend to buy those.
They very much seem to be saying that most people should just buy the 3000 series cards and wait. Given the glut in the market, that makes sense to me. I suspect this may be a generation where the Ti cards are cheaper.

Gamers who care about “bang for the buck” don’t tend to buy those.
No we do not. The 70 series dies just fine.

The 70 series dies just fine.
What are you doing to those poor cards? You monster!
Seriously, though, how is a 3070 for 1440 gaming? Max settings on a AAA title would be what, 60 to 80 fps? Just trying to get a sense of it. (I see from the next post quoting you that you have a 1070Ti, so consider this a general question to anyone.)

My 1070Ti runs everything at max settings smoothly on my 1080p screen.
Ditto. But then I don’t play shooters.
May upgrade someday, but not this year. I have to say I am enjoying the increased useable lifespan of PC chips/graphics cards these days. Maybe I’ve just been lulled to sleep by my silent SSDs, but my aging PC still feels more or less as robust as when I spec’ed it out in late 2018.

Seriously, though, how is a 3070 for 1440 gaming?
I may tell you someday; I’ve been eyeballing them…

Seriously, though, how is a 3070 for 1440 gaming?
It’s a solid choice. Probably around 60-90fps in more intensive games at 1440p with max settings. Better on older games and competitive online titles.
I wouldn’t buy one right now, though. The RX 6800XT is the same price and a healthy 25%+ performance boost over the 3070, provided you’re not into ray-tracing.

I wouldn’t buy one right now, though. The RX 6800XT is the same price and a healthy 25%+ performance boost over the 3070, provided you’re not into ray-tracing.
Even if you were, is a 3070 sufficient for 1440p with ray tracing? I feel like you’d be lucky to hit 60 FPS with ray tracing on.

Even if you were, is a 3070 sufficient for 1440p with ray tracing?
Not natively at 1440p. You could probably squeeze 60+fps out of stuff using DLSS, provided the game supports it. As a ceiling, I saw a benchmark video of a 3070 running Cyberpunk 2077 at 1440p, Ultra settings, Ray-Tracing on (Ultra) and DLSS on (Quality) and it was getting ~50fps. You could get that to 60+ by inching down from Ultra to High/Medium and probably not lose much in the process.
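For a sense of why DLSS buys back so many frames here, a small sketch of the internal resolutions DLSS actually renders at before upscaling. The per-axis scale factors are the commonly cited ones for DLSS 2.x modes; treat them as approximate:

```python
# Commonly cited per-axis render scales for DLSS 2.x modes (approximate).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales to the target."""
    scale = DLSS_SCALES[mode]
    return round(width * scale), round(height * scale)

# At 1440p with DLSS Quality, the card is really rendering about 1707x960,
# which is how a 3070 can hold ~50fps in Cyberpunk with ray-tracing cranked up.
print(internal_resolution(2560, 1440, "Quality"))  # (1707, 960)
```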
But, and I’m not someone dismissive of ray-tracing, I’d easily take an extra 25% performance from the 6800XT in basically all games for my $550 (as of this post) over getting the 3070’s ray-tracing in a handful of games while being behind in everything else. The 6800XT is a card designed to compete with the 3080 but currently at the price of a 3070.
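Put as crude perf-per-dollar arithmetic (numbers from this post: both cards at $550, with the 6800XT roughly 25% ahead in rasterization; these are snapshot prices, not gospel):

```python
# Crude value comparison using the numbers in this post: both cards
# at $550 (as of this post), 6800XT roughly 25% faster in raster.
cards = {
    "RTX 3070":  {"price_usd": 550, "relative_perf": 1.00},
    "RX 6800XT": {"price_usd": 550, "relative_perf": 1.25},
}

for name, card in cards.items():
    value = card["relative_perf"] / card["price_usd"]
    print(f"{name}: {value * 1000:.2f} relative perf per $1000")

# Same price, ~25% more frames: the 6800XT wins on raw value unless
# ray-tracing in a handful of titles is worth that gap to you.
```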

But, and I’m not someone dismissive of ray-tracing, I’d easily take an extra 25% performance from the 6800XT in basically all games for my $550 (as of this post) over getting the 3070’s ray-tracing in a handful of games while being behind in everything else. The 6800XT is a card designed to compete with the 3080 but currently at the price of a 3070.
Interesting.
More compelling to me is that the 6800XT comes with 16 GB of VRAM compared to the 3070’s 8 GB. Double the memory is quite a bit of extra headroom, especially at higher resolutions.