As I wrote yesterday, I’ve built a new gaming PC for home, and I promised to write some more today about why I chose certain components. I’ll write about two of them today, and talk a little bit about power consumption.
The first component I bought was the power supply. Power supplies aren’t very exciting components, and the main reason I picked this one is that I saw it on special on Slickdeals. I knew I wanted a power supply with enough wattage to eventually run two video cards in my computer (ATI/AMD calls this CrossFire), and this one can handle that. I also wanted something relatively energy efficient, and since this one carries the 80 Plus certification, I won’t be wasting a lot of power by using it. Antec is generally a quality brand, and this is a quality (and heavy) power supply, but it isn’t a “modular” supply, where you attach only the cables needed to power the specific components in your build. As a result, I have a couple of extra cables in my case, which makes for a bit of a mess when you’re wiring everything up. If I were doing this over again, I’d wait a bit longer for a deal to pop up on a modular supply, as it would make the finished build look much tidier.
I spent a lot of time agonizing over which video card to purchase. There are a LOT of options for video cards, at many different price points, and typically the way I choose one is by finding the fastest card at the price I’m willing to spend. I read numerous benchmarks, and fortunately, AnandTech is currently using Civilization 5 as one of its benchmarking games. I’ve been playing that game quite a bit lately, so it’s a very useful benchmark, and it was pretty clear that in the $150-ish price range, the NVIDIA GeForce GTX 460 cards offered the most bang for the buck in Civ 5. However, I’ve also been mining some Bitcoins lately with my hardware, and for Bitcoin mining, the Radeon cards are the only way to go. They’re also somewhat confusing to shop for, since mining performance doesn’t scale cleanly with price, due to the way the mining software uses the card’s processing power. After poring over breakdowns of cost, mining performance, and energy consumption, I settled on a Radeon HD 6870 as a card that would perform well at Bitcoin mining and in games, and still come in at a price I could live with. It’s actually faster than the GeForce GTX 460 in most games other than Civilization 5. Honestly, though, if you’re not interested in Bitcoin mining, go for the GTX 460: good deals on them show up regularly on Slickdeals, and you can save at least $25 over the Radeon HD 6870, which ran me $165 after rebate. If you want to stick with the Radeon family, the slightly slower Radeon HD 6850 is also a good choice; it’s just not nearly as good at Bitcoin mining as its bigger brother.
So, how much power does this new rig of mine use? Can I actually turn a profit on my Bitcoin mining? I plugged in my trusty Kill-A-Watt tonight to find out, and here are the results, not counting the monitor:
| State | Power consumption (watts) |
| --- | --- |
So, I’m clearly not stressing my 620W power supply yet, but these numbers let us easily calculate what it costs me to mine Bitcoins. Our power costs about 7.8 cents per kWh, so when mining Bitcoins, I’m using about 34 cents’ worth of power per day versus leaving my computer turned off. At my current rate, I can earn a Bitcoin about every 4 days, and they’re currently trading at over $15 each, so I can clear roughly $13 in profit every four days. (Obviously, that heat goes somewhere, so my air conditioning will have to work slightly harder in the summer, but I’ll save natural gas on heating in the winter, so we’ll call it a wash.)
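For the curious, here’s the arithmetic above sketched in a few lines of Python. The mining wattage isn’t read straight off the Kill-A-Watt here; it’s back-derived from the 34-cents-per-day figure, so treat it as an estimate:

```python
RATE_PER_KWH = 0.078  # our electricity rate, in dollars per kWh

def daily_cost(watts, hours=24.0, rate=RATE_PER_KWH):
    """Dollars per day to run a load of `watts` for `hours` hours."""
    return watts / 1000.0 * hours * rate

# Working backward from ~34 cents/day of extra draw while mining:
mining_watts = 0.34 / (24 * RATE_PER_KWH) * 1000
print(round(mining_watts))  # ~182 W above "off"

# Profit on one coin earned over 4 days of mining, at $15/BTC:
profit = 15.0 - 4 * daily_cost(mining_watts)
print(round(profit, 2))  # ~13.64
```

The same `daily_cost` helper also confirms the idle figure at the end of the post: roughly 69 W of idle draw works out to about 13 cents per day at this rate.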
Clearly, putting the computer to sleep is a good way to cut down on your power bill, but even leaving it idle isn’t going to break the bank, at a cost of only about 13 cents per day.