Energy consumption

We measured the energy consumption of the different cards, keeping in mind that there is often some variation between individual cards of the same model due, among other things, to current leakage and differing GPU voltages. Moreover, some run at higher clocks because they are factory overclocked.
We measured energy consumption directly on the graphics cards themselves.
At idle, we observed an average energy consumption of 32W across all the GeForce GTX 580s. Note however that this climbed to 40W on the Gigabyte SOC when the OC Guru software wasn't installed, as the card's power stage doesn't enter its idle mode without it. The Asus Matrix Platinum has a clear advantage here, probably because its power stage is particularly well optimised in this respect; this is also why that card's GPU temperature is the lowest.
Under load, the differences between the cards increase and the situation becomes more complicated. Note that there is a relationship between GPU temperature and energy consumption, as higher temperatures increase current leakage: a well-cooled GPU draws a little less power, while a badly cooled one can find itself in a vicious circle. The same goes for the power stage, whose efficiency can drop as it heats up. These factors probably explain, in part, why the energy consumption of the Classified, Phantom and SOC cards explodes when switching from 3DMark to Furmark, which was moreover tested in a version that doesn't trigger OCP (overcurrent protection).
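This feedback loop between temperature and leakage can be sketched with a toy model: leakage power rises roughly exponentially with die temperature, and temperature rises with total power, so the card settles wherever the two balance. All the constants below (reference leakage, thermal resistances, the exponent) are hypothetical illustration values, not measurements from these cards.

```python
import math

def settle(p_dynamic, r_thermal, t_ambient=25.0,
           p_leak_ref=20.0, t_ref=40.0, k=0.02, steps=200):
    """Iterate the power/temperature feedback until it settles.

    p_dynamic : switching power of the GPU in watts (fixed by the load)
    r_thermal : cooler thermal resistance in degrees C per watt
    Leakage is modelled as p_leak_ref * exp(k * (T - t_ref)),
    i.e. it grows exponentially as the die heats up.
    Returns (total power in W, die temperature in degrees C).
    """
    t = t_ambient
    for _ in range(steps):
        p_leak = p_leak_ref * math.exp(k * (t - t_ref))
        p_total = p_dynamic + p_leak
        t = t_ambient + r_thermal * p_total
    return p_total, t

# Same dynamic load, different cooling quality (hypothetical values):
# the well-cooled card settles cooler AND draws less total power,
# because it leaks less -- the "vicious circle" working in reverse.
good = settle(p_dynamic=200.0, r_thermal=0.15)
bad = settle(p_dynamic=200.0, r_thermal=0.30)
```

With these made-up numbers the well-cooled card stabilises tens of watts below the poorly cooled one at the same workload, which is the mechanism behind the load-consumption differences described above; with an even higher thermal resistance the loop stops converging at all, which is thermal runaway.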
The MSI Lightning, on the other hand, is the most economical GeForce GTX 580 under load.