Far Cry 5 has a 5-7 fps spread between the 1070 Ti and 1080 at 4K ultra quality. The two cards really are very similar, which made reviewers wonder what space the 1070 Ti was supposed to fill. It’s not a budget-conscious alternative to the 1080, since they’re about $50 apart (MSRP, anyway), but it’s also a significant jump ahead of the standard 1070. It’s almost like someone trying to nickel-and-dime the price down from Nvidia like a Craigslist deal: “Hey, so if I don’t need ALL these cores, can you keep a hundred of them and knock off fifty bucks?”
Really, my understanding is that the 1070 Ti exists because Nvidia wanted a direct alternative to the AMD Vega 56, which landed between the 1070 and 1080 in performance. This is why the 1070 Ti came out so late compared to the other cards – they didn’t need anything in that space until the Vega 56 launched and started outperforming the 1070.
Edit: BTW, I’m not any help on the monitor front; I’m using a 1080p screen and never bothered looking into 4K monitors.
The 1070 Ti might very well have been prompted by the Vega 56. Sometimes, I wonder if these late models are introduced because the designers/manufacturers realized that a large number of chips baked better than expected, but not quite well enough to be sold in the top bin. Example with made-up numbers: to be sold as a 1080, a GP104 chip has to be 95% right at a minimum. To be sold as a 1070, a GP104 chip has to be 80% right at a minimum. If you get a bunch of GP104s that come out at 90%, it might be tempting to create a new bin.
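If it helps, here’s a toy sketch of that binning logic in Python, using the same made-up yield numbers (the 90% bar for the in-between bin is my invention, not a real spec):

```python
# Toy binning sketch with made-up thresholds: chips that miss the 1080 bin
# but clear the 1070 bar by a wide margin pile up as candidates for a new
# in-between SKU like the 1070 Ti.
BINS = [                      # (minimum fraction of the chip that works, product)
    (0.95, "GTX 1080"),
    (0.90, "GTX 1070 Ti"),    # hypothetical new bin
    (0.80, "GTX 1070"),
]

def bin_chip(yield_fraction):
    for minimum, product in BINS:
        if yield_fraction >= minimum:
            return product
    return "scrap / lower product"

for y in (0.97, 0.92, 0.83, 0.70):
    print(f"{y:.0%} functional -> {bin_chip(y)}")
```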
Don’t most rigs need significant power supply upgrades for said top-tier cards?
Given an ultra-fast and modern multi-core processor in conjunction with such a card, are there still “bottlenecks”, like the bus speed on a motherboard? Are there others?
Nvidia isn’t always superior to AMD; they frequently trade the top slot and best value slot. The current Nvidia-favoured cycle has lasted longer than usual though.
The GPU is pretty much the only thing that matters if you’re targeting 60 Hz. Any quad-core CPU since like 2010 is probably fine for 60 Hz, and DDR3 vs. DDR4 wouldn’t matter. If you want a constant 144 Hz, pretty much everything needs to be top-end unless the game is older: a high-clock CPU and tons of RAM bandwidth.
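The frame-time budget makes that concrete – simple arithmetic, nothing card-specific:

```python
# Frame-time budgets: at 144 Hz the CPU and GPU together get well under half
# the time per frame that they get at 60 Hz, which is why everything in the
# box has to be fast to hold it.
for hz in (60, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 144 Hz -> 6.9 ms per frame
```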
Some games are just hot garbage that nothing can run particularly well, like PUBG.
In the last few years, the best AMD has been able to do is tie, whether in the CPU or GPU market. For example, the 390 was pretty good, but it was only equal to the 970 while arriving a year late. I’ve heard about AMD GPUs being better at Bitcoin mining but I don’t know enough about it to confirm.
It wasn’t always so. I have a Phenom II X4 955 CPU, and I used to have a 6850 GPU I got in early 2011, because they were great value.
Top-tier cards are going to run at 250 to 300 watts. If you got a slightly beefy power supply when you bought your PC, it should be able to handle it.
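Rough envelope math, with made-up but plausible numbers, if you want to sanity-check your own supply:

```python
# Back-of-the-envelope PSU sizing with made-up but plausible numbers: a
# top-tier card plus the rest of the system, with ~30% headroom so the
# supply isn't running flat out.
gpu_w   = 300   # worst case for a top-tier card
cpu_w   = 100   # typical gaming load for a desktop CPU
other_w = 75    # drives, fans, motherboard, RAM
load_w  = gpu_w + cpu_w + other_w
print(f"Estimated load: {load_w} W, suggested PSU: ~{load_w * 1.3:.0f} W")
# Estimated load: 475 W, suggested PSU: ~618 W -> a decent 600-650 W unit
```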
Presuming an infinitely fast CPU and infinitely fast GPU, your bottlenecks are going to be the interconnects at various levels: RAM to CPU, VRAM to GPU, PCIe from CPU to GPU, and any time you need to pull data from your drive.
The biggest one right now is GPU (the chip) to VRAM (the onboard memory), because a graphics card is a bit like its own motherboard: it faces the same problems as RAM to CPU while having to move a lot more data. I expect that within two generations we’ll be looking at 1 TB/s of bandwidth, and games will still be gobbling it up as fast as it can be served.
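For a sense of scale, VRAM bandwidth is just bus width times the effective per-pin data rate. Using the GTX 1080’s published figures (256-bit bus, GDDR5X at an effective 10 Gbps per pin) as the example:

```python
# VRAM bandwidth = (bus width in bytes) x (effective data rate per pin).
# GTX 1080: 256-bit bus, GDDR5X at an effective 10 Gbps/pin.
bus_bits = 256
effective_gbps_per_pin = 10
bandwidth_gbs = (bus_bits / 8) * effective_gbps_per_pin
print(f"{bandwidth_gbs:.0f} GB/s")   # 320 GB/s -- still a long way from 1 TB/s
```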
AMD cards are superior for mining, but I couldn’t say exactly why. For the most part, AMD cards have provided less raw performance but better performance per dollar than Nvidia, since they’re generally cheaper. That’s not really the case in this cryptomining market, though.
I ended up picking up the card Friday evening. I was leaning towards getting the card, but thanks all for confirming that it wasn’t a bad purchase.
I finished buying all the parts for the system. I ended up paying more than I originally wanted to once I realized my 7-plus-year-old rig just wasn’t cutting it anymore, even after upgrading the video card 4 years ago. But seeing as I ended up going high-end, rather than the high-mid that my previous rig was, I’m not disappointed.
And, with many 1070s going for more than this 1080 was (and every 1070 Ti going for more), and realizing that my old rig seems to be on its last legs, I didn’t feel like waiting and risking my old rig just dying out.
Interestingly enough, I had signed up for Massdrop, thinking they can sometimes get decent deals on GPUs – the general idea being that if enough people request a certain item, then based on the percentage of people who follow through, they can get manufacturers to release a quantity at a discount given the likely sales. A 1070 Ti dropped Friday morning for $650. So yeah, I got the 1080 instead.
It’s probably the best 4K monitor you can get for the money if you want G-Sync.
However, it uses a TN panel, which, while considerably cheaper, faster-responding, and more efficient than the newer IPS technology, sacrifices some color accuracy and viewing angles. The viewing-angle thing won’t matter if you’re going to be sitting close to the monitor, as is typical for a computer monitor as opposed to a TV. And the color reproduction is still better than a lot of other monitors. But you won’t get the vivid color accuracy you will from an IPS monitor.
It’s a great all-around monitor. It’s IPS, so it should have better color accuracy than the Predator. From what I’ve read about it, you won’t be disappointed. It’s a solid, no-frills monitor that rivals the high-end monitors for a fraction of the price. But you’ll have a slower response time, and you won’t have the advantages of G-Sync’s superiority over standard V-sync when it comes to smoothing and tear reduction.
There’s talk of the next generation GPUs being built in such a way that they would take a huge hit on crypto mining while still being good for graphics, and then also putting out cards that are specifically for mining. This is an attempt to get the pricing under control for this stuff.
I have to admit, even $600 seems wrong to me based on how long those cards have been out. That sounds like the initial MSRP, and I’d have expected them to go down a bit without the crypto crap. That said, it’s been a long time since I’ve looked at the GPU market. I just follow from afar.
(My rig has a GT 1030, which is all I really need for what I do. And even that was only because of the low power and low heat requirements in a mini-case; otherwise I could have gone with an older card.)
I’m skeptical about how sincere AMD and Nvidia are about trying to separate off the mining market. If they’re selling 100% of the cards they can make now, why split production between gaming and mining? They’re making record profits right now because of mining, and they’re running at full production capacity as they do it. If they cripple mining on gaming GPUs, then they’re taking away from their primary profit driver.
I could see them making a mining-specific card that worked better (or at least as well) for mining but was crap for graphics purposes, and selling it ‘cheaper’ but at a larger profit margin. I have a harder time imagining them shooting themselves in the foot by making gaming GPUs that don’t work for mining. There’s been a lot of lip service since this whole thing started but very little real action, and even the action has usually been pretty temporary.
Maybe they could make the exact same card and advertise and package it differently to fool the miners – “Optimized for Crypto Mining” versus “Optimized for Gaming” – and price it accordingly. Or have essentially the same card with a couple of tweaks designed to tip it one way or the other without having to seriously compromise assembly or profits. I don’t know. Maybe that’s not something that can be done with these.
I don’t doubt that they can make a mining-specific card, or maybe even a mining-crippled card. But back in the pre-mining days, when gamers were the primary market, they weren’t selling every card they could produce like hotcakes. Now they are. If you know miners are buying 100% of your stock and gamers were buying (let’s say) 85%, it’s hard to imagine them taking pains to shut the miners out of a product line. The only way that’s profitable is if you’re guaranteed that gamers will buy 100%, or if you’re inflating the price to make up the difference.
When this whole nonsense started, there was a lot of lip service about “We’re here for the gamers”, because no one knew if this was going to be some six-month fad and they wanted to stay in the good graces of their core market. These days the core market probably is miners (for retail boxed cards, anyway), and giving up part of the market is just giving it to the other team, AMD or Nvidia.
Market segmentation. Miners are on average willing and able to pay more for GPUs than gamers are. You sell the gaming GPU at the usual price and the mining GPU at a higher price.
I think they’ll have a hard time convincing miners that “you want our stripped-down card for a higher price” without it somehow being super exceptional at mining. Likewise, if you cripple a GPU for mining, you’re taking a good bit of value away from it, since many gamers DO do some mining with their cards (AMD, in fact, is encouraging this). Furthermore, if they’re selling 100% of standard GPUs now, what happens when the mining GPUs sell out (since they have to make fewer of them due to production limits) and the miners can’t buy gaming GPUs as an alternative? Now they’re just leaving money on the table.
They could likely increase the base price for all the new standard GPUs and still sell them all, so segmenting the market won’t accomplish much. If the crypto market crashes, you lower the base price on the standard GPUs accordingly.
I had signed up for notification from EVGA’s website for when they get cards back in stock. Apparently, they have one of their 1070 cards in stock…for $555.
That’s over $150 more than the Founders Edition goes for. It’s interesting, because when the Founders Editions first came out, they were typically $50-$75 above the prices of other versions. One of the reasons was that Nvidia didn’t want to anger their manufacturing partners (e.g. EVGA) by undercutting them on price.
So this EVGA card is arguably something that would have gone for no more than $350 back in 2016.
I don’t think it would be hard at all, if the gaming cards are particularly bad at mining and can’t be trivially uncrippled. It seems the idea is actually a different architecture, so that might do it.
And price discrimination is a tried-and-tested technique for extracting more money from a market. Right now, they’re losing out on a whole lot of the gaming market because the prices set by the miners are too high for the gamers. They have every reason to try to make a card that gamers could afford without the price being driven up by the miners. There’s more money to be had by bringing in the people who can afford less.
They can likely increase the price a bit on the mining cards, too, but that’s only because they’ve pulled the gamers out of that segment, meaning they no longer push the price downward. And, yes, it does allow for the inevitable crash of the bubble to not touch their gaming prices.
As for having cards they can’t sell: meh. If they’re making two different types of cards, they can just adjust supply by making more of the other kind. As long as the same equipment can be used, that would still work.
The big problem would be if the gamer cards can be hacked or modded cheaply for mining. That would kill this idea entirely. (The mining cards being good for gaming would not be a problem as long as mining prices are higher, but might be a problem once the bottom falls out.)
The fundamental problem with the segmentation strategy is that GPU mining is a temporary thing. It doesn’t make sense to spend money re-tooling, developing firmware, or whatever else goes into it. Once the network difficulty increases too much or ASICs come out, GPUs are done. We’re very close to that point on Ethereum, which was the main target of current GPU mining.
This matches my impression of the second reason GPU prices were starting to fall a little: ASICs were coming out that were significantly more efficient at mining calculations.
This is only if vsync is enabled; I generally recommend everyone run with vsync off.
G-Sync/FreeSync does still improve perceived smoothness below 60 fps, but not because the framerate automatically drops to 30 – with vsync off, it doesn’t.
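A quick sketch of why, with made-up frame times and assuming simple double buffering: vsync on a 60 Hz panel holds each frame for a whole number of refresh intervals, so the framerate quantizes to 60/30/20, while adaptive sync just tracks the actual render rate:

```python
# Toy comparison (made-up frame times): double-buffered vsync on a 60 Hz
# panel vs. adaptive sync (G-Sync/FreeSync), which refreshes whenever a
# frame is ready (within the panel's supported range).
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

def vsync_fps(render_ms):
    # Each frame is held for a whole number of refresh intervals.
    intervals = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / intervals

def adaptive_fps(render_ms):
    # The panel refreshes as soon as the frame is ready.
    return 1000 / render_ms

for render_ms in (15, 20, 25, 34):
    print(f"{render_ms} ms/frame: vsync {vsync_fps(render_ms):.0f} fps, "
          f"adaptive {adaptive_fps(render_ms):.1f} fps")
# 20 ms/frame renders at 50 fps, but vsync quantizes it down to 30.
```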
Price segmentation makes sense if you have the capacity to make all these different models and you want to appeal to different markets. But the places that are making GPUs are at capacity – it would take significant capital investment and time to create new production facilities for these cards, and they’re not willing to do that, because when the mining craze dies that extra capacity becomes costly and unprofitable.
For them to make lower-market cards would necessarily take away production from the higher-market cards, since we’re at full capacity. There’s little reason for them to do anything now except produce as many cards as they can and sell them at the highest price they can.