ChrisBFRPKY
What was predictable was that if Bitcoin ever 'took off', mining it would require ever-increasing processing power - and therefore ever-increasing electricity use. The knock-on effects of this were also quite predictable.
But why must increased processing power equate to increased electricity usage? Because Moore's Law is dead. Chip density doubling every 2 years? Doesn't sound that dramatic, but over a few decades that geometric progression has given us computers with a million times more transistors in them. To someone like me who was playing around with computers that had only a few kilobytes of RAM, the idea that I would one day be using a PC with gigabytes of RAM in it was ludicrous - and yet Moore's Law had already predicted that this would happen.
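To put a number on that geometric progression, here's a quick sketch of the arithmetic (assuming only the classic two-year doubling period - nothing here is specific to any real chip):

```python
# How long does steady doubling take to reach a million-fold increase?
# Assumes the classic Moore's Law doubling period of 2 years.
import math

def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """Transistor-count multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

# Invert 2 ** (years / 2) = 1,000,000 to find the required time span.
years_to_million = 2.0 * math.log2(1_000_000)

print(f"After 40 years: {growth_factor(40):,.0f}x")           # 1,048,576x
print(f"Million-fold in about {years_to_million:.0f} years")  # ~40 years
```

A million-fold increase is just twenty doublings - about forty years at Moore's pace, which is roughly the lifetime of the PC era.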
But Moore's Law is failing. A modern high-end CPU or GPU is vastly more powerful than the chips used in early PCs, but it also uses a lot more power. The ever-shrinking transistor size used to keep power consumption down, because smaller transistors can switch faster without drawing more electricity (what chip designers call Dennard scaling), but that effect could not keep up with the insatiable demand for more processing power.
The ever-increasing processing power (and therefore energy input) required to mine Bitcoins is not a bug, it's a feature - one that its inventor, and anyone who understood how it worked, was well aware of. What they were perhaps not aware of - or chose to ignore - was the scale. I suspect most early Bitcoin users either didn't do the math or were unable to comprehend it: mining Bitcoins on their PCs and thinking, "If these things take off I might need a more powerful PC", but not considering that when the price hits $1 million due to widespread adoption they will be competing against billion-dollar mining pools sucking power directly from hydro dams. But if Bitcoin was to fulfill their dreams then this development was inevitable - and totally predictable.
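The mechanism that enforces this arms race is the difficulty adjustment: every 2016 blocks the network rescales the proof-of-work target so that blocks keep arriving roughly every ten minutes, no matter how much hash power joins. Here's a simplified sketch of that retarget logic (the real implementation works on compact-encoded 256-bit targets; the function name and float arithmetic here are just for illustration):

```python
# Simplified sketch of Bitcoin's difficulty retarget rule.
# More hash power buys you nothing but a harder puzzle.

RETARGET_BLOCKS = 2016           # blocks between adjustments
TARGET_SPACING = 10 * 60         # desired seconds per block
EXPECTED_SPAN = RETARGET_BLOCKS * TARGET_SPACING  # about two weeks

def retarget(old_difficulty: float, actual_span_seconds: float) -> float:
    """Scale difficulty so the next 2016 blocks take ~two weeks.

    If miners added hardware and found blocks twice as fast,
    difficulty doubles; the consensus rule clamps any single
    adjustment to a factor of four in either direction.
    """
    ratio = EXPECTED_SPAN / actual_span_seconds
    ratio = max(0.25, min(4.0, ratio))
    return old_difficulty * ratio

# Example: the last 2016 blocks took one week instead of two,
# i.e. hash power roughly doubled -> difficulty doubles too.
print(retarget(1.0, EXPECTED_SPAN / 2))  # 2.0
```

So every extra megawatt of mining hardware is absorbed by the protocol within weeks, leaving everyone burning more electricity for the same number of coins.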
How widely used Bitcoin will become may not be predictable, but the amount of electricity it will use when it gets there is. Market theory tells us that the 'industry' will find ways to lower its production costs, which in this case means getting cheaper power. So 'miners' will go to locations where they can get it (i.e. China). And since mining is a business, not an ideology, they don't care if it's cheaper due to government subsidies or slave labor or any other 'immoral' reason (even theft, if they can get away with it).
With the newer generations of chips, a modern CPU or GPU actually uses much less power for the same work. I've been into mining since 2013, not exclusively BTC, but I always trade any other coins I mine for BTC as the end result.
I've used CPU mining for CPU algo coins, GPU mining for GPU algo coins, and ASIC mining for SHA-256 and Scrypt algo coins.
As an example, when mining the CPU coin "ROI Coin", my i7-950 CPU draws about 125 watts to produce about 40 H/s on that algo. In contrast, my Ryzen 7 1700 draws about 65 watts to produce 800 H/s on the same algo. So to match the Ryzen's output I'd need 20 of the i7-950 machines, drawing somewhere around 2,500 watts in total, to do the same work the Ryzen 7 1700 does at 65 watts.
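To make the comparison concrete, here's the per-watt math on those figures (the wattage and hash-rate numbers are the ones quoted above; the script itself is just an illustration):

```python
# Hashes-per-watt comparison using the figures quoted above.
rigs = {
    "i7-950":       {"watts": 125, "hashrate": 40},   # H/s on the ROI Coin algo
    "Ryzen 7 1700": {"watts": 65,  "hashrate": 800},
}

target = rigs["Ryzen 7 1700"]["hashrate"]  # match the Ryzen's 800 H/s

for name, rig in rigs.items():
    efficiency = rig["hashrate"] / rig["watts"]   # H/s per watt
    machines = target / rig["hashrate"]           # rigs needed for 800 H/s
    total_watts = machines * rig["watts"]
    print(f"{name}: {efficiency:.2f} H/s per watt; "
          f"{machines:.0f} machine(s) drawing {total_watts:.0f} W for {target} H/s")

# i7-950:       0.32 H/s per watt; 20 machine(s) drawing 2500 W for 800 H/s
# Ryzen 7 1700: 12.31 H/s per watt; 1 machine(s) drawing 65 W for 800 H/s
```

That's nearly a 40x improvement in hashes per watt in a single hardware generation jump.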
GPUs and ASICs have followed the same curve: much more work for much less power.
Chris B.