Curbing Your GPU's Power Use: Is It Worthwhile?

In many cases, the graphics card is the most power-hungry component in a PC. The enthusiast community is no stranger to CPU tweaking, so why hasn't GPU modification caught on? We're going to see just how much you stand to gain (or lose) from tweaking.


An increasing number of enthusiasts are becoming aware that their GPUs are the primary consumers of power in their PCs. For some, especially for our readers outside of the U.S., power consumption is an important factor in choosing a graphics card, along with performance, price, and noise levels. At the same time, as we begin to focus more intently on GPU power consumption, it is disconcerting to see the real cost of owning a high-end graphics card. If you read What Do High-End Graphics Cards Cost In Terms Of Electricity?, then you already know what we’re talking about. 

Now, you're probably wondering what you can do to help alleviate the issue. Graphics vendors like AMD and Nvidia build in technologies that help cut power use during idle periods, but the only surefire way to slash consumption is to use a mainstream graphics card instead of a high-end model. At the end of the day, simpler cards based on less-complex GPUs require less power than their high-end siblings.

You end up making sacrifices when you give up the displacement of a big graphics engine, though. Most mainstream cards don't offer enough performance to play the latest games at the highest resolutions using the most realistic detail settings. If you want to play games completely maxed out, a high-end card is the only option.

Is there really no alternative to using mainstream graphics cards for the power-conscious? What if there was a way to manually cut down the power consumption of faster graphics cards? These are questions we ask (and try to answer) today.

A Short Overview of GPU Power Management

Although add-in graphics cards for desktop PCs don't curb power consumption as aggressively as discrete notebook GPUs, they still employ power management. In fact, power management technologies for desktop graphics cards have been available for quite a while. They usually manifest themselves as separate clocks for 2D (desktop) mode and 3D mode. Think of these as P-states on modern processors. With the availability of hardware-accelerated video playback, vendors have also added a new mode for video playback.
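Conceptually, those per-mode clocks work like a small lookup table: each mode pairs a core clock, a memory clock, and a voltage, and the driver picks the lowest-power profile that can handle the current workload. The sketch below illustrates the idea; the mode names, figures, and `select_mode` helper are made-up examples for illustration, not the specifications of any real card.

```python
# Hypothetical illustration: a desktop GPU's power-management states resemble
# a table of (core clock, memory clock, voltage) profiles, one per mode,
# much like CPU P-states. All numbers are invented for this example.

GPU_MODES = {
    "2d_desktop":     {"core_mhz": 157, "mem_mhz": 300,  "volt": 0.95},
    "video_playback": {"core_mhz": 400, "mem_mhz": 900,  "volt": 1.00},
    "3d_full":        {"core_mhz": 850, "mem_mhz": 1200, "volt": 1.17},
}

def select_mode(workload: str) -> dict:
    """Pick the lowest-power profile that suits the workload."""
    mapping = {"idle": "2d_desktop", "video": "video_playback", "game": "3d_full"}
    return GPU_MODES[mapping[workload]]

# At the desktop, the driver drops to the slow, low-voltage 2D profile.
print(select_mode("idle")["core_mhz"])
```

The key point is that the transitions are automatic and coarse: you get a handful of fixed profiles, not a continuous dial you can turn yourself.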

What's missing from most graphics cards is an option to cap power consumption, similar to what the “Power saver” preset in the Windows Control Panel does for CPUs. Enable that preset and the processor runs at lower clocks to keep power consumption down. There are new approaches to this problem on the graphics side, though. AMD introduced PowerTune for its Radeon HD 6900–series cards. We’ll look into the effectiveness of this capability later in this piece.

Finding the Right Combination

Lowering operating clocks is one way to reduce power consumption. This is as true for GPUs as it is for CPUs. However, clock speed (core and memory) is only one part of the equation. As with CPUs, the graphics processor’s operating voltage also plays a role.

If we wanted to limit or lower power consumption, couldn't we just manually set clocks and voltages? It’s actually really easy to modify frequencies using vendor-provided utilities and third-party software. And why not? Finding the right combination of clock and voltage can offer significant power savings.
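Why does voltage matter so much? Dynamic power in CMOS logic scales roughly with C·V²·f, so power falls linearly with clock speed but with the square of voltage. The sketch below uses that standard approximation to estimate relative savings; the specific clock and voltage figures are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope sketch of why voltage is the bigger lever.
# Dynamic power scales roughly with C * V^2 * f (the classic CMOS
# approximation), so relative savings can be estimated without knowing C.
# The example numbers below are assumptions, not measured values.

def relative_dynamic_power(f_new, f_old, v_new, v_old):
    """Return new dynamic power as a fraction of the old."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Example: drop the core clock from 850 to 700 MHz and the core voltage
# from 1.175 V to 1.05 V.
fraction = relative_dynamic_power(700, 850, 1.05, 1.175)
print(f"{fraction:.2f}")  # roughly 0.66, i.e. about a third of dynamic power saved
```

Note the asymmetry: the ~18% clock cut alone only buys ~18% of dynamic power, while the ~11% voltage cut contributes another ~20% on top of it. That squared term is why finding the right clock-and-voltage combination beats touching clocks alone.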

Altering voltages is a different matter. Most graphics cards don't offer an easy way to adjust voltage. In fact, after certain individuals blew up their GeForce GTX 590s using unrealistic voltage settings, Nvidia locked out voltage manipulation on those cards altogether. It’s not clear whether that’ll apply only to the GTX 590 or to a broader sampling of the company’s portfolio, but it demonstrates the harm that can come from too much tinkering.

How about the other cards out there that can still be modified? Unfortunately, voltage adjustments are limited to 3D mode. Most cards do not offer a way to adjust voltages at idle or in any intermediate mode.

But today we’re going to perform an experiment. We're going to see just how much power we can save by lowering clocks and voltages. In the process, we’re going to measure the associated performance hit to gauge whether those changes are worthwhile. We’ll be using two cards: AMD’s Radeon HD 5870 and 6970, with a 5770 for comparison.
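A simple way to judge whether a tweak is worthwhile is to compare performance per watt before and after: if frame rate falls by less than power draw does, efficiency improves. The helper below sketches that comparison; the FPS and wattage figures are placeholder assumptions, not results from our testing.

```python
# Hypothetical yardstick for the experiment: a clock/voltage tweak "pays off"
# in efficiency terms if FPS per watt rises. The sample numbers are invented
# placeholders, not measurements from the cards under test.

def fps_per_watt(fps: float, watts: float) -> float:
    """Efficiency metric: frames per second delivered per watt consumed."""
    return fps / watts

stock   = fps_per_watt(60.0, 200.0)  # e.g. 60 FPS at 200 W
tweaked = fps_per_watt(52.0, 150.0)  # a lower frame rate, but far less power

print(tweaked > stock)  # the downclocked configuration is more efficient
```

Of course, raw efficiency isn't the whole story: if the performance hit drops a game below playable frame rates, no amount of saved wattage makes the trade worthwhile, which is exactly what the benchmarks ahead are meant to reveal.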


