GPU prices...Yikes!


Well, if Everett says so, I guess that’s the end of the discussion.


Think of the render farm potential in the future. You’ll get your finished frames back before you can even hit submit.


You mean like using all those cards to actually do something useful, instead of crunching numbers for the sake of it?

Jokes aside, that’s the idea behind Otoy’s “Token” system, and probably one of the few good uses of blockchains.


Whether or not you think it’s a good investment…mining is all the rage. And it’s driving some people to buy cards in mass quantities.


Think about the positives as well. If the major cryptocurrencies crash badly in the coming months, there will be hundreds of thousands of second-hand GPUs on sale, probably for peanuts, on eBay and similar sites.

That would force Nvidia and AMD to immediately slash prices on new GPUs - because if they don’t, everybody will be trying to scoop up the cheap second-hand cards instead.

Yes, they can slash prices pretty hard, in my humble opinion. I doubt that a GTX 1080 or similar card actually costs all that much to manufacture.


According to your own link, the top three most profitable devices are two GPUs and an ASIC. The GPUs can be had for a few hundred dollars, whilst the ASIC miner has a minimum order of five units ($8,000), still needs power supplies and racks, and would need 7,000 watts of power to run. Plus, you need to keep in mind resale value once the hardware is no longer profitable: a GPU can be resold for a significant portion of its purchase price, while a custom-made ASIC built for one specific purpose is near worthless.
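To make the resale argument concrete, here is a toy comparison of the *effective* hardware cost once you sell the device on. All figures (prices and resale fractions) are illustrative assumptions, not real market data:

```python
# Toy comparison of effective hardware cost for mining: GPU vs ASIC.
# Purchase prices and resale fractions are invented for illustration.

def effective_cost(purchase_price, resale_fraction):
    """Net hardware cost once you resell the device after mining."""
    return purchase_price * (1.0 - resale_fraction)

# A GPU keeps much of its value on the second-hand market...
gpu = effective_cost(purchase_price=300.0, resale_fraction=0.6)

# ...while a purpose-built ASIC is near worthless once unprofitable.
asic_batch = effective_cost(purchase_price=8000.0, resale_fraction=0.05)

print(f"GPU net hardware cost: ${gpu:,.2f}")
print(f"ASIC batch net cost:   ${asic_batch:,.2f}")
```

Even before counting the ASIC’s power supplies, racks and 7,000 W draw, the resale channel alone changes the economics dramatically.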

The single most popular card for crypto mining right now is the GeForce GTX 1060, due to its relatively low power draw - that, plus the fact that you can sell it on at a later date.

Personally, while GPU mining is this popular, it would completely prevent me from buying a second-hand graphics card on eBay: a card abused by 24-hour mining is going to have a much shorter lifespan than one used the normal way, as a video output.


Nvidia is less likely to be affected by a flood of used video cards: the Volta lineup is ready to roll, and miners and gamers will jump on it. AMD’s graphics division, on the other hand, would suffer greatly - they have nothing in the near future likely to spike interest, and would face alone a cut-throat second-hand market filled with their current-gen products.


I don’t know…Aside from fan failure, GPUs are pretty rugged, IMO. Fan failure can be mitigated.


I have to say I disagree a bit. They are rugged, yes, but of all the components in your computer, I’d put the GPU in the middle or near the top of the list of parts that might not last that long. That is especially so if you are running overclocked versions in the 80-90+ degree range.

In my experience it’s usually the boards that “die” first. You can do some resurrecting if you bake them in the oven (literally), but it’s not a long-term solution. I don’t recall reading much about failures of the actual GPU (the processor, not the card), but VRAM and the board seem to get hit more often.

That said, I think it depends on what you mean by longevity :slight_smile: Generally it’s pretty common for a CPU to last you 10+ years, though obviously there are the “lucky” few who draw the short straw. Graphics cards, on the other hand… I rarely hear of that kind of longevity, because there are multiple components at play and it’s pretty common for most of them to heat up to around 90 degrees.

Bottom line: for what they are, I think GPUs are rugged. But if you’d ask me whether a GPU on average outlasts a CPU in terms of lifespan? I’d say no :slight_smile: Ultimately I recommend getting water cooling for GPU rigs, or any of the hybrid combos. Personally I tend to think that if you can keep it below something like 75-80 degrees when rendering, you can expect at least 4+ years out of it. I pulled those numbers out of my ass (experience), so don’t ask me for sources lol :slight_smile:


The most likely damage would occur in a situation with constant thermal expansion and contraction of the hardware: physical matter expands slightly when heated, and shrinks slightly when it cools down again.

If you had a crazy application that heats a GPU to 90 degrees, then lets it cool to 20, then takes it up to 90 degrees again, and back down to 20 degrees and on and on, hundreds of times a day, the hardware may over time become unreliable and experience failure.

I don’t know whether a GPU engineered to run at 80 degrees constant for a 12 hour gaming session without damage happening would actually get damaged if you kept it at 80 degrees 24/7. Electronics components are supposed to be over-engineered slightly to cope with this sort of load.

AFAIK, electronics running at a constant temperature that is within design limits do not become damaged by that temperature as long as they are engineered properly for that temperature.

But the number of times a GPU can go to Max heat, cool down to Min heat, and go back up to Max heat may indeed be limited. Do that cycle 100,000 times or 200,000 times, and maybe your GPU will indeed crap out.
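That intuition - that the *swing*, not the steady temperature, limits cycle count - matches fatigue models such as the Coffin-Manson relation, where cycles-to-failure fall off as a power law of the temperature delta. A sketch with made-up constants (the coefficient and exponent below are invented for illustration, not measured for any real GPU):

```python
# Toy Coffin-Manson-style estimate: cycles to failure shrink as a
# power law of the temperature swing. Constants are illustrative only.

def cycles_to_failure(delta_t, c=2.0e9, exponent=2.5):
    """N_f = C * (delta_T)^-n -- bigger swings fail far sooner."""
    return c * delta_t ** -exponent

steady = cycles_to_failure(delta_t=10)   # card held at near-constant temp
cycled = cycles_to_failure(delta_t=70)   # repeated 20 -> 90 degree swings

print(f"small 10-degree swings: ~{steady:,.0f} cycles")
print(f"large 70-degree swings: ~{cycled:,.0f} cycles")
```

With these placeholder constants, the 70-degree swing wears the part out orders of magnitude faster than the 10-degree one - which is why constant 80-degree operation can be gentler than repeated heat-up/cool-down cycles.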

The fan is another matter. Depending on whether it’s a cheap $3 part or a sturdier $8 or $12 part, and whether the fan is ever cleaned, and so on, it may indeed die through prolonged use.


Well, you see, I don’t think it’s just heat per se. Some of it is just the components and their durability. Basically it can be anything from a slightly too-thin copper wire to moisture in the air that, in combination with the heat levels, does it.

Depending on whether you think it’s marketing BS or truth, there is also that manufacturer statement that GeForce cards (and I presume Radeon RX too) supposedly aren’t built for 24/7 operation, or stuff like that. Personally I am not quite sold on that, but hey :slight_smile:


It may be the case that Quadros and similar cards use slightly tougher or higher-quality materials, and are actually intensely stress-tested for, say, 50 hours before being sold to professionals.

A GeForce card that can’t take that 50 hour super-stress test may still be of high enough quality and reliability to sell to a gamer. At worst the GeForce may overheat after 3 - 4 years of use and force a reboot during a gaming session.

A Quadro or other Pro card that fails the stress test or other quality testing after manufacture would likely go in the garbage bin and not get sold at all.

If you want to get conspiratorial, of course, you could reason that maybe GeForce cards are deliberately under-engineered to fail under certain types of very intense loads, or use a cheaper production process and cheaper build materials that make GeForce cards slightly more prone to failure or errors.

This actually happens with many physical products where you have a cheap model, a medium price model, a full price model and also a super expensive Deluxe model.

If you buy the el-cheapo TV, dishwasher, oven, vacuum cleaner or washer-dryer model, even from a very well-known brand, quite often it isn’t all that long before a repairman has to visit your house and swap out a circuit board or another component to get the appliance working again.

You “get what you pay for” as they say. The cheap models aren’t built for reliability - they are built to be cheap.


A GPU might not last 10 years…but do we really care? GPU technology is changing much more rapidly than CPU tech. I suspect most of us upgrade our GPUs by the time they reach 6-8 years old.


I guess you’d definitely want them working for at least 3 years, I suppose. What’s wrong with having extra render nodes anyway? :stuck_out_tongue:


It seems to me that it makes little sense to mine Bitcoin et al. at all, at the moment. A graphics card can of course mine, and so can a CPU, but at current prices all mining does is consume electricity and cost money, no matter how you do it. None of the big commercial farms use graphics cards, and they all locate where electricity costs are minimal, as that is the main cost.
I looked deep into the idea of mining. However, I live in Germany, and need to pay at least 21 euro cents per kWh for a commercial, volume supply. That is more than double the average quoted price in the US. It makes no sense at all to mine in Germany, with anything, at current prices: we pay more for everything required, and have a tax liability to consider. Prices would have to go beyond 20k to make pennies… Basically, pretty much everyone who went nuts investing in hardware when Bitcoin peaked is now losing money.

In Germany it makes no sense at all, full stop - not even at 18k. To mine legitimately, as an enterprise, makes no sense at all here. Anyone using a graphics card to mine right now is paying for the privilege. It might seem like money, but you may as well just buy your currency, as it costs more to mine it than to buy it. It cannot even be classified as pocket money, unless you completely ignore all your real costs…
Imagine that you try to mine commercially, and that you have bills and tax to pay. At what point does this become a viable thing to do? It is fundamentally wildly speculative, and you have to be naive and willing to kid yourself to want to mine right now, in particular in Germany (and probably anywhere in the EU). It probably makes more sense in places with less regulation and lower operating costs.
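The electricity argument is easy to check for yourself. A back-of-envelope sketch, where the rig wattage and daily revenue are placeholder assumptions (plug in current hash rates and coin prices yourself); only the 21 ct/kWh German tariff and the 8 ct/kWh comparison come from this thread:

```python
# Back-of-envelope mining break-even check. Rig wattage and daily
# revenue below are assumed placeholder figures, not real data.

def daily_profit(revenue_eur_per_day, rig_watts, eur_per_kwh):
    """Revenue minus electricity; ignores hardware cost and taxes."""
    kwh_per_day = rig_watts / 1000.0 * 24
    return revenue_eur_per_day - kwh_per_day * eur_per_kwh

rig_watts = 700    # assumed draw of a multi-GPU rig
revenue = 2.50     # assumed EUR/day the rig earns in coins

germany = daily_profit(revenue, rig_watts, eur_per_kwh=0.21)
cheap = daily_profit(revenue, rig_watts, eur_per_kwh=0.08)

print(f"At 21 ct/kWh (Germany): {germany:+.2f} EUR/day")
print(f"At  8 ct/kWh:           {cheap:+.2f} EUR/day")
```

With these assumed figures the same rig loses money at German rates and turns a (small) profit at 8 ct/kWh - and that is before hardware, cooling, and tax, which only make the German case worse.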
Anyhow, I stand by my point that using a graphics card to mine is nuts. But I do recognize that lots of people are indeed doing it, so that makes it not a myth. However, I amend that with: “mining anything, with anything, is currently nuts.” I would not buy a graphics card for the purpose of mining, and I suspect that those who have regret it, and are already realizing the futility of their endeavour.

Check this amusing video about a guy who tried to make a GPU mining rig (skip forward to 14:00 for the reality of it all).

Having said all that… GPUs everywhere… seems profitable at 8 cents per kWh, with scale.


Current inflated GPU prices are 100%, indisputably & solely the result of GPU mining.
It doesn’t matter if there was a point a few years ago where that wasn’t true.
It’s the cause now.
A year ago, I bought a 1080ti for £600. Even 6 months ago, I could just about get one for £650. Now they start at £840.
Cryptocurrency has managed to create forms of exchange that:

  • Are even less based on the real economy than existing complex financial derivatives
  • Are especially useful to organised crime
  • Result in pointless environmentally damaging waste power consumption
  • Are even more subject to instabilities & fluctuation than most existing forms of speculation
    The only answer, imo, is for nation states to exercise their power to ban them. Hopefully South Korea’s recent moves point a way forward.
    Not because it matters too much if we over-pay for GPUs but because cryptocurrencies have no redeeming features in society at large (except perhaps as an interesting experiment).


Bitcoin is a classic BUBBLE, where people lusting after “free, easy money” without understanding economics are rushing in, thinking BC may go to 20K or 30K or even higher.

How did this happen? All those “YOU CAN MINE BITCOIN TOO!” promotional sites online were probably set up by people who already had thousands of Bitcoins or even more at hand.

THEY are the ones cashing in now - everybody else is inflating their Bitcoin wallet for them with the current mining craze.

Pretty much everybody loses. The people who started the craze sell their Bitcoins and win.

If you got into Bitcoin 4-5 years ago, you may indeed have made big money by now. But it’s the people driving the current BUBBLE that made those Bitcoins worth so much.


The bubble is clearly going to burst. I just hope the idiots who bought up these GPUs just for this purpose burn horribly when it does - and I hope it’s soon!
To be honest, I couldn’t care less about this bitcoin/cryptocurrency crap. What I do care about is how it affects my job.

This random price explosion poses a major challenge for studios who rely on this hardware and were planning on greatly expanding.
Guess we should have pulled the trigger a month or two ago rather than waiting until this quarter.


Every machine I’ve ever had fail, and every machine of a friend’s I’ve had to go and fix, has had a dead graphics card every single time. I’ve probably sorted 10 dead workstations over the years (about half Macs) and have thrown 10 dead GPUs in the trash*

Generally I’ve used the dead GPUs as a kick to go upgrade a system.

*proverbially. I oven-baked them or eBayed them off for beer money.


The popularity of Ethereum mining affects some GPU markets. Personally, I like ASIC mining because it’s more efficient than GPUs - you have to buy several GPUs to get the same result as one ASIC. ASICs might be more expensive, though.