As far as I understand, the two most important specifications for graphics cards, along with the memory of course, are pixel and texture fillrate. To dumb it down: pixel fillrate kind of decides at what resolution you can play with decent framerates, and texture fillrate is how many high-quality effects, 4K textures and so on you can afford. You only need pixel fillrate up to a certain amount, after which it isn't that important, and then you're mostly looking at texture fillrate if you want to play on higher settings.
So, if I understood that correctly, what pixel and texture fillrates should I look for if I want to get 50 or 60 fps at 1080p on high settings (not necessarily very high or ultra) in most modern games?
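To show the ballpark I'm reasoning in, here's my rough back-of-envelope math; the 3x overdraw factor is just my own assumption, not something I've measured:

```python
# Rough pixel throughput estimate (my own assumption-heavy sketch):
# required pixel rate = pixels per frame * target fps * assumed overdraw

width, height = 1920, 1080   # 1080p
target_fps = 60
overdraw = 3                 # assumed average overdraw per frame

pixels_per_frame = width * height                          # 2,073,600 pixels
required_pixels_per_sec = pixels_per_frame * target_fps * overdraw

print(f"{required_pixels_per_sec / 1e9:.2f} Gpixels/s")    # ~0.37 Gpixels/s
```

Even with those assumptions that comes out far below the tens of Gpixels/s that current cards list, which is why I figure pixel fillrate stops mattering past a certain point and texture fillrate (and the rest) takes over.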
At the moment I have an old Radeon HD 7730 2GB; it's about 3 to 5 times weaker than even a GTX 1050, so even when I can get modern games like Wildlands to open, I have to play at 1280x1024 or lower and on pretty much the lowest settings possible: no antialiasing, no shadows, no ambient occlusion, no bloom, no HDR, low vegetation, …
I was looking at the still old but powerful Radeon HD 7970, but I think it might be a dual-GPU (CrossFire) card, and I need a single card. I can only afford older cards like that, so I have to go through each card's specifications, since there aren't a lot of benchmarks for cards that old. At least one positive thing is that I don't mind playing at around 1280 or 1440; the largest resolution my screen supports is only 1680x1050 anyway, so I don't need to reach full 1080p. I just don't want to have to play at 25-30 fps on very low settings at low resolutions.
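For reference, here's the raw pixel-count arithmetic for the resolutions I mentioned (just a quick sketch so the comparison to 1080p is concrete):

```python
# Pixel counts of the resolutions I'm considering, relative to 1080p.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
}

full_hd = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.0%} of 1080p)")
```

So my monitor's native 1680x1050 is only about 85% of the pixels of 1080p, and 1280x1024 is about 63%, which is why I'm hoping a card that can nearly manage 1080p/60 on high would be more than enough for me.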