You know, I’m not quite in the market for a new card, but if I were to grab one tomorrow I’d probably grab the 6800XT. I see Sapphire offering a 6800XT on Amazon for $540. I had a Sapphire brand Radeon card a number of years ago and it served me well.
My 1070Ti does what I need it to do for now though, and my money’s a bit tight anyway. Especially with Christmas looming.
Just ran across this reddit thread again, the one I saw a while ago that originally got me considering a 3070 because of this reply:
Is that true? If so, it moves the 3070 way up to the front of the line for me.
Googling around, the quietest 6800 XT cards appear to be around 40 dBA, which is much too loud for me. I’m looking for less than 30.
The 1050 ti I’m using now (MSI Gaming X) is 27 dBA at load according to this TechPowerUp review.
And then checking for the quietest 3070s, TechPowerUp has the MSI Gaming X Trio tied for quietest with the Asus TUF (they don’t have an entry for the ROG Strix) at 28 dBA. Of course the Asus Noctua model is the quietest, but that’s like $200 more. The key with MSI Gaming cards for me is the low maximum RPMs on the fans. My 1050 Ti tops out around 1300, I think, and that chart of 3070s has the Gaming X Trio about 200 rpm lower than the rest, 1500 vs 1700+.
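(To keep the candidates straight, here’s the comparison as a quick script. The dBA and rpm figures are just the ones quoted above from the TechPowerUp reviews, so treat them as ballpark numbers rather than a definitive ranking.)

```python
# Rough comparison of the cards mentioned in this thread.
# Figures come from the TechPowerUp numbers quoted above -- ballpark only.
cards = [
    {"name": "MSI 1050 Ti Gaming X (current card)", "dba": 27, "max_rpm": 1300},
    {"name": "MSI 3070 Gaming X Trio",              "dba": 28, "max_rpm": 1500},
    {"name": "Asus 3070 TUF",                       "dba": 28, "max_rpm": 1700},  # "1700+" per the chart
    {"name": "typical 6800 XT",                     "dba": 40, "max_rpm": None},
]

NOISE_CEILING_DBA = 30  # my personal limit

# Keep anything under the ceiling, quietest first.
quiet_enough = sorted(
    (c for c in cards if c["dba"] < NOISE_CEILING_DBA),
    key=lambda c: c["dba"],
)

for c in quiet_enough:
    rpm = c["max_rpm"] if c["max_rpm"] is not None else "?"
    print(f"{c['name']}: {c['dba']} dBA, fans max out around {rpm} rpm")
```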
At what point do we think 3070s will stop being sold new? Next summer?
As a wild swing, I’d guess this winter or next spring. Nvidia will want to get the 4070s out and I’d expect discounts on remaining 3000 series over the holidays to clear them out. Just a guess though.
Do they usually overlap? For example, when the 3070 was first released, could you still buy 2070s new? I’m guessing they do not overlap, but really hoping they do. Ideally with a nice price drop on the older gen.
Actually I can answer my own question right now. You can still buy 3090s, even though 4090s are available as well, I’m pretty sure. I suppose it could be different the lower down the stack you go.
I have no problem waiting an arbitrarily long time; I’m just excited that a silent 30 series I could eventually buy might actually exist. That whiny 3050 got me thinking that maybe no modern video cards will ever be quiet again. Certainly doubling the power consumption each generation doesn’t make them quieter, y’know? Was thinking I might be stuck with the 1050 ti for life. Still kind of thinking that, but the concept of a 3070 with an overpowered cooler gives me hope.
(Or I’d have to switch to water, but I strongly prefer air cooling.)
With the crypto crash and general decrease in demand for new GPUs, I think there are more surplus cards than usual. Newegg, Amazon, Best Buy, etc. are just sitting on old stock and trying to sell it off. That’s partially why I guessed late winter or spring – put that old stuff on Black Friday/holiday sales and get rid of it.
Buuuuuttttt… they haven’t even announced a 4070 release date yet, so who knows. First they need to sand off all those 4080 12GB stickers.
That reminds me, I’m really not in love with 8 GB on the 3070. Better than the 4 GB I have now, at least, but still not good.
I really wish I could mod them. If I could, I’d enthusiastically just slap a couple of bequiet fans on the 3050 I already have and be done with it.
Wait, how do you water cool a video card? Are there special water cooled models that come without fans, or do you have to get a regular model and pull the fans off yourself? Are they soldered on? I’m realizing now that I know nothing about video card cooling.
I guess the Asus Noctua line is the answer for people like me. I just wish there was a bequiet version because all my fans are bequiet. And also that it didn’t cost an extra $200.
Oooo, check out the shiny new hardware tag. Nice work, @What_Exit.
There are some video cards that come with AIOs attached (like mine) but they’re higher-end cards. Likewise, there are some conversion kits, but they are (a) card-specific, since you need to match the layout of the GPU’s board, and (b) again, for high-end cards. No aftermarket AIO coolers for GTX 1650s. In theory, you could rig something up since it’s the same principle as cooling a CPU, but that’s best left to tinkerers who aren’t too worried about messing up their card.
Video card coolers are detachable with a few screws, and sometimes it’s worth it to replace the thermal paste or a busted fan. The fans tend to just plug into headers like your case fans do (but smaller and more delicate ones). You usually have a block of aluminum with heat pipes running over the GPU die, and thermal pads creating contact between the memory modules/controllers and the block. Taking apart a GPU is probably an intermediate-skill task – harder than inserting memory, easier than delidding a CPU. It’s not that it’s “hard”, just that the risk of messing up an expensive component is higher. But I’ve done it a few times to rehab an old card, and the issue is more nerves than actual difficulty.
For people holding out for a mid-tier 3000-series card, be aware that a few new variants of the 3060/3070 lines are coming out which are slimmed down from their originals. Or stripped down, depending on how charitable you feel. An 8GB 3060 with a narrower memory bus (128-bit instead of 192-bit) and a 3070 Ti on the GA102 die with fewer cores enabled are a couple of the ways Nvidia and AIBs are using up their remaining stock.
I think a lot of these wind up in prebuilts where they can just say “Gaming PC with an RTX 3060” but you might see some retail ones on store shelves in the future. Make sure your card is what you think it is.
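(If you ever want to double-check what’s actually in the box, GPU-Z will show the die and the memory bus width. Failing that, here’s a minimal sketch that just asks Nvidia’s driver what it sees – it assumes nvidia-smi, which ships with the driver, is on your PATH; the name plus total memory is enough to spot an 8GB 3060, though it won’t tell you the bus width.)

```python
# Minimal sketch: ask the Nvidia driver which card it actually sees.
# Assumes nvidia-smi (installed with the Nvidia driver) is on the PATH.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # e.g. "NVIDIA GeForce RTX 3060, 8192 MiB"
```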
I should add the new Radeon cards support DisplayPort 2.1 and AV1 encode/decode, and max out at 350 watts of power (so no special power adapter needed).
Judging from what AMD revealed (and what they didn’t reveal), I’m guessing the same basic stance as the last few gens: Nvidia for performance per tier, AMD for price per tier.
A bit of downward pressure on Nvidia will be nice after the crypto-inflation thing but Nvidia will still have a market for people who just want the best so I wouldn’t expect big price drops on the 4090/4080. Won’t know until we see some benchmarks though.
I dunno if I’d go that far. Nothing is going to touch the 4090 since that card is a joke, but AMD might take the performance edge against the rest of Nvidia’s lineup, given how underwhelming the preview benchmarks for those cards were.