The energy consumption of 93 graphics cards! - BeHardware

Written by Damien Triolet and Marc Prieur

Published on January 17, 2011




For a good few years now, graphics cards have been the component that draws the most power in high-end machines. They can consume up to 300 watts, while a processor that hasn't been overclocked generally falls somewhere between 50 and 100 watts, with roughly the same for the rest of the configuration (motherboard, hard drives and so on). When several cards are combined in SLI or CrossFire X, it's no surprise to see the graphics solution account for between 50% and 75% of the total power draw of the system. So what do the various models consume?

Measuring the energy consumption of a graphics card
Last February, we introduced a new protocol that allows us to measure the energy consumption of the graphics card in isolation from the rest of the system. Measuring this value is quite complex because a graphics card draws power from two separate sources:

- The PCI-Express x16 port, which can supply up to 75W
- Additional PCI-Express connectors, each of which can supply 75W (6-pin) or 150W (8-pin)
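These per-source limits set a theoretical ceiling on what any card can draw within spec. As a rough illustration (not part of the article's measurement protocol), the ceiling for a given card is simply the slot limit plus the sum of its auxiliary connectors:

```python
# Theoretical PCI-Express power budget from the limits cited above.
# Illustrative helper only, not part of the article's measurement protocol.
SLOT_W = 75    # PCI-Express x16 slot
PIN6_W = 75    # 6-pin auxiliary connector
PIN8_W = 150   # 8-pin auxiliary connector

def power_budget(six_pin=0, eight_pin=0):
    """Maximum power (watts) a card may draw within spec."""
    return SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

# e.g. a card with one 6-pin and one 8-pin connector may draw up to 300W
print(power_budget(six_pin=1, eight_pin=1))  # 300
```

This is why the 300W figure quoted in the introduction is the practical ceiling for single cards of this era: one 6-pin plus one 8-pin connector on top of the slot.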

Measuring the consumption drawn via the additional PCI-Express power connectors is relatively simple: voltage can be read directly at the graphics card PCB and amperage with a clip-on ammeter. It's more complex for the PCI-Express port, but doable, and we have set up a system that lets us take the energy consumption drawn by the graphics card from this source, whether from the 3.3V line (generally negligible, just a few watts) or the 12V line.
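The arithmetic behind the protocol is straightforward: each supply rail contributes voltage times current, and the card's total is the sum over all rails. A minimal sketch, with hypothetical rail names and readings purely for illustration:

```python
# Per-rail power arithmetic behind the protocol: each rail contributes
# P = V * I, with voltage read at the PCB and current via clip-on ammeter.
# The rail names and readings below are hypothetical examples.
readings = {
    "slot_3.3V": (3.3, 0.6),   # (volts, amps) - generally negligible
    "slot_12V":  (12.1, 4.8),
    "6pin_12V":  (12.0, 5.2),
    "8pin_12V":  (11.9, 9.1),
}

total_w = sum(v * a for v, a in readings.values())
print(f"Total card consumption: {total_w:.1f} W")
```

Summing per rail rather than measuring at the wall is what lets the card be isolated from the rest of the system, and also excludes power supply inefficiency.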

Last February, we published the readings obtained across 79 cards in four different situations:

- In 2D, with the Windows desktop
- During the playback of an H.264 Blu-ray (PowerDVD + chapter 3 of I, Robot)
- In the 3DMark06 Pixel Shader test at 1920*1200
- In the FurMark Stability Test in Xtreme mode at 1920*1200

We use two 3D load tests because, depending on the architecture, the cards rank differently relative to each other in the two tests. We have now dropped the Blu-ray reading as it wasn't giving us much useful data compared to the 2D test. This year, otherwise continuing with the same protocol, we are able to present the results obtained across no fewer than 93 cards!

List of cards

Radeon HD 2900 XT 512 MB
Radeon HD 2900 GT 256 MB
Radeon HD 2600 XT 256 MB
Radeon HD 2400 XT 256 MB
Radeon HD 3450 256 MB
Radeon HD 3470 256 MB
Radeon HD 3650 512 MB
Radeon HD 3850 256 MB
Radeon HD 3870 512 MB
Radeon HD 3870 X2
Radeon HD 4350 512 MB
Radeon HD 4550 512 MB
Radeon HD 4650 512 MB
Radeon HD 4670 1 GB DDR2
Radeon HD 4670 512 MB DDR3
Radeon HD 4730 512 MB
Radeon HD 4770 512 MB
Radeon HD 4830 512 MB
Radeon HD 4850 512 MB
Radeon HD 4850 1 GB
Radeon HD 4870 512 MB
Radeon HD 4870 1 GB
Radeon HD 4890 1 GB
Radeon HD 4850 X2
Radeon HD 4870 X2
Radeon HD 5450 512 MB DDR2
Radeon HD 5450 512 MB DDR3
Radeon HD 5770 1 GB
Radeon HD 5830 1 GB
Radeon HD 5850 1 GB
Radeon HD 5870 1 GB
Radeon HD 5870 2 GB
Radeon HD 5970
Radeon HD 6850 1 GB
Radeon HD 6870 1 GB
Radeon HD 6950 2 GB
Radeon HD 6970 2 GB
GeForce 8400 GS 256 MB
GeForce 8500 GT 256 MB
GeForce 8600 GTS 256 MB
GeForce 8800 GS 384 MB (= 9600 GSO)
GeForce 8800 GT 256 MB
GeForce 8800 GT 512 MB
GeForce 8800 GTS 320 MB
GeForce 8800 GTS 640 MB
GeForce 8800 GTS 512 MB
GeForce 8800 Ultra 768 MB
GeForce 9400 GT 512 MB
GeForce 9500 GT 512 MB DDR2 64b
GeForce 9500 GT 512 MB DDR3
GeForce 9600 GSO 384 MB
GeForce 9600 GSO 512 MB
GeForce 9600 GT 512 MB
GeForce 9800 GT 512 MB (= 8800 GT)
GeForce 9800 GTX 512 MB
GeForce 9800 GTX+ 512 MB
GeForce 9800 GX2
GeForce G100 512 MB
GeForce 210 512 MB DDR2
GeForce 210 512 MB DDR3
GeForce 310 (= GeForce 210 DDR2)
GeForce GT 130 768 MB
GeForce GT 220 512 MB DDR2
GeForce GT 220 1 GB DDR2
GeForce GT 220 512 MB DDR3
GeForce GT 220 1 GB DDR3
GeForce GT 240 512 MB DDR3
GeForce GT 240 512 MB GDDR5
GeForce GT 240 1 GB DDR3
GeForce GT 240 1 GB GDDR5
GeForce GT 340 512 MB (= GT 240 GDDR5)
GeForce GT 340 1 GB (= GT 240 GDDR5)
GeForce GTS 250 512 MB (= 9800 GTX+)
GeForce GTS 250 1 GB
GeForce GTX 260 896 MB
GeForce GTX 260 216 896 MB
GeForce GTX 275 896 MB
GeForce GTX 280 1 GB
GeForce GTX 285 1 GB
GeForce GTX 295
GeForce GTS 450 1 GB
GeForce GTX 460 768 MB
GeForce GTX 460 1 GB
GeForce GTX 465 1 GB
GeForce GTX 470 1.25 GB
GeForce GTX 480 1.50 GB
GeForce GTX 570 1.25 GB
GeForce GTX 580 1.50 GB

The results


[Graph: energy consumption readings for all 93 cards, classified in order of load (FurMark/3DMark average)]


With their recent generations of GPUs, NVIDIA and then AMD have attached more importance to offering lower-consumption cards (even at the high end), with the exception of 3D load, that is. Most recent cards draw somewhere in the order of 20 to 30 watts when not in 3D mode.

Energy consumption in load peaked with the Radeon HD 4870 X2, which still holds the dubious record of being the highest-consumption card to date. NVIDIA did get dangerously close, however, with the GeForce GTX 480 and 580, mono-GPU solutions that manage to consume up to 309 watts in extreme load!

Conscious of the bad publicity of having a graphics card that draws too much power, both NVIDIA and AMD have introduced solutions that aim to limit the energy consumption in load to more reasonable levels.

With the GTX 570 and 580, NVIDIA has introduced components that allow the drivers (but not the GPU directly) to monitor GPU consumption. There's no overall monitoring: the driver simply activates its system (lowering clocks beyond certain energy consumption levels) in the latest versions of FurMark and OCCT (the version of FurMark that we're using wasn't detected), that's to say in the extreme load tests used by testers to measure energy consumption. In practice, the NVIDIA system ends up placing limitations on such software that stop you from using it, rather than keeping energy consumption within a well-defined thermal envelope.

AMD has also integrated a mechanism, PowerTune, that aims to limit the energy consumption of its cards. In contrast to the NVIDIA system, PowerTune is controlled by the GPU and is global. Beyond a limit fixed by AMD, which in the worst cases comes to 200W on the Radeon HD 6950 and 250W on the 6970, the GPU lowers its clocks. You can modify this limit via the Catalyst Overdrive panel. In practice, increasing the limit takes a Radeon HD 6950 from 159/161 watts in 3DMark/FurMark to 166/185 watts, and a Radeon HD 6970 from 211/214 to 245/271 watts.
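The control loop behind such a global cap can be sketched in a few lines: when the estimated consumption exceeds the limit, clocks are stepped down, and when there is headroom they recover. This is only a schematic illustration of the principle described above; the telemetry function, step size, and clock bounds are all hypothetical:

```python
# Minimal sketch of a PowerTune-style global power cap. The power estimate,
# step size, and clock bounds are hypothetical, chosen for illustration.
def powertune_step(clock_mhz, est_power_w, limit_w,
                   step_mhz=10, min_mhz=500, max_mhz=880):
    """Return the next core clock given the current power estimate."""
    if est_power_w > limit_w:
        return max(min_mhz, clock_mhz - step_mhz)  # throttle down
    return min(max_mhz, clock_mhz + step_mhz)      # recover headroom

clock = 880  # Radeon HD 6970 reference core clock (MHz)
for est in (260, 255, 250, 240, 235):  # hypothetical power samples (W)
    clock = powertune_step(clock, est, limit_w=250)
print(clock)
```

Because the loop reacts to a global power estimate rather than to the names of specific applications, any workload is capped the same way, which is the key difference from the detection-based NVIDIA approach described above.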

Copyright © 1997-2015 BeHardware. All rights reserved.