It’s really annoying that GPU prices are jacked up for crypto demand. I’m sure the graphics card companies are luvin’ it, but it’s really annoying as a consumer. Is it possible to have a “pure graphics” or a “pure crypto” line that could separate them out for specialization and make everybody happy?
You could, by gimping the number of general-purpose compute units and double-width FPUs, and by making certain memory optimizations. Indeed, one of the former best value/power-tradeoff cards, the GTX 970, is known to be really bad for CUDA work because of certain memory optimizations and concessions made for gaming.
However, the issue is that developers are starting to use all of this: compute units for particle effects, physics, or any number of other things; doubles for various hacks (though those still aren't used much); and memory for bigger textures. It's really hard to justify these gimps, and it's rare that a card that's good for gaming but bad for mining comes about "naturally". The 970 was a rare exception. Especially since real-time ray tracing is the new hotness. (Don't expect it in games for many years, and probably for at least a generation of cards, but the new cards will support it in an infantile state.)
I should mention that a lot of this is due to the fact that over the past 10-15 years, card designers have found that having generalized units in the GPU is actually faster than dedicated units, if done right. Back in the fixed-pipeline days, and even the early programmable-pipeline days*, cards used to have a small number of compute units, a lot of vertex units, and a lot of fixed fragment units. As time has moved on, they've moved more and more toward an architecture that's just a lot of compute units with really good hardware schedulers, because it ends up being faster.
* Don’t worry about what this means
As an aside, the rise of graphics cards (at the time, driven by games) was a huge boon to scientific computing, since a lot of the same features that make a processor good for video-game graphics also make it good for simulations.
The answer may be to make a dedicated mining card without the graphics optimizations, as opposed to a graphics card without the stuff that mining needs.
However, there may not be any incentive to do that: it would cut into the windfall they enjoy, and the mining-optimized cards would need to offer an advantage over simply buying a cheaper graphics card to justify their higher price.
Those are coming out, such as the ASUS Mining P106. It has no graphics output connectors at all. If I understand correctly, the GPU itself on this board is not a standard NVIDIA chip, but one optimized for cryptocurrency mining.
Note that there has been a notable fall in pricing as demand for new cryptocurrency gear has declined. One complicating factor has been the cost of memory, which still has a notable effect.
That article covers prices through July. I expect that, with the even more dramatic fall in cryptocurrency prices, things will soon return to normal (with many cards selling below MSRP).
The question is: are the idiot "home" miners still going to add equipment? The big data-warehouse miners are already using quite a bit of specialized gear and aren't likely to expand. But the number of people who think Bitcoin is the get-rich-quick scheme for them is amazingly large. (It turns out I'm semi-related to one such bozo. So far he's successfully turned real money he desperately needs into excess heat.)