What FPS gain is there when playing at 1680x1050 as opposed to 1080p?

I have an older 1680x1050 monitor (let’s call it 1050p). Since I have a very weak GPU, up to now I couldn’t play even at that resolution, only at terrifyingly small ones like 1280x1024 and lower. If I do set the resolution to 1050p, the picture is of course much, much better, and I wouldn’t mind playing at it if I got a decent FPS (I currently get below 15). I’m finally getting a new GPU soon, so I’d like to know what kind of speeds I can expect relative to 1080p benchmarks.

For example, if the new card I get can reach exactly 60 FPS at ultra settings at 1080p, approximately what speed would I get at 1680x1050?

At the moment I can barely afford a new GPU, so upgrading to a 1080p monitor is out of the question. By buying the GPU, though, I’ll essentially be able to upgrade from 1280x1024 to 1680x1050, which is a huge improvement by itself.
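
For a rough sense of what to expect, here’s a quick back-of-envelope comparison of pixel counts (just a sketch, assuming the naive case where FPS scales inversely with the number of rendered pixels when fully GPU-bound; real games won’t scale that cleanly):

```python
# Back-of-envelope pixel-count comparison. Assumes the naive case where a fully
# GPU-bound game's FPS scales inversely with pixel count, so the "gain" figure
# is an upper bound, not a prediction.
resolutions = {
    "1280x1024": (1280, 1024),
    "1600x900": (1600, 900),
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
}

base_pixels = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    share = pixels / base_pixels        # fraction of 1080p's pixel count
    max_gain = (1 / share - 1) * 100    # best-case FPS gain vs. 1080p
    print(f"{name}: {pixels:>9,} px ({share:.0%} of 1080p, up to ~{max_gain:.0f}% more FPS)")
```

By that measure, 1680x1050 pushes about 85% as many pixels as 1080p, so even in the best (fully GPU-bound) case the gain tops out somewhere around 17–18%.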

Here’s what I saw when switching between resolutions in a game I have installed that has a built-in benchmark.

Arkham Knight, FPS (min/max/avg)
1920 x 1080 = 112/209/162
1680 x 1050 = 115/226/167
1600 x 900 = 113/243/171

The difference in average framerate would probably be much greater if I had a slower GPU; I think I’m CPU-bound here. I also tried Civ 6 and Total Warhammer, and those showed no change, so they were definitely CPU-bound.

Looking at the change in max FPS, I think the absolute best-case scenario is about a 16% increase in frame rate going from 1080p to 900p (209 vs. 243 max FPS). I use 900p rather than 1050p for the comparison because it keeps the 16:9 aspect ratio. Drawing conclusions from one game isn’t great, but it’s a start.
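
As a quick sanity check on that figure, here are the max-FPS numbers from the table above plugged into a couple of lines of Python (nothing beyond the benchmark values already listed):

```python
# Percent change in max FPS, using only the Arkham Knight numbers above.
def pct_gain(new, old):
    return (new / old - 1) * 100

max_fps = {"1920x1080": 209, "1680x1050": 226, "1600x900": 243}

print(f"1600x900  vs 1080p: +{pct_gain(max_fps['1600x900'], max_fps['1920x1080']):.0f}%")   # ~16%
print(f"1680x1050 vs 1080p: +{pct_gain(max_fps['1680x1050'], max_fps['1920x1080']):.0f}%")  # ~8%
```

So roughly 16% going to 900p and about 8% going to 1050p, both under the theoretical pixel-count ceiling, which fits the game being at least partly CPU-bound on my system.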