Conclusion

Updating our power consumption protocol and applying it across no fewer than 73 cards allows us to present the most comprehensive power consumption data currently available for graphics cards. Covering the large majority of the NVIDIA and ATI DirectX 10/11 ranges over the last three years, this data allows us to draw several conclusions.
The first is that over the last three years, the power consumption of graphics cards has remained fairly stable: while the GeForce 8800 Ultra drew 192.7 watts, the GTX 285 is at 208.3 watts and the Radeon HD 5870 at 195.4 watts. Only dual-GPU cards consume more, with the Radeon HD 4870 X2 reaching a sorry 321.8 watts.
In 2D and Blu-ray playback, however, graphics card manufacturers have made great efforts to reduce power consumption: the high-end cards of 2007 drew more than 70 watts here, while a 5870 is at 19.6/42.8 watts (2D/Blu-ray) and a GTX 285 at 29.5/62.2 watts.
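To put the scale of that 2D (idle) improvement in perspective, here is a minimal sketch that computes the percentage reduction from the figures quoted above; the 70-watt baseline is an approximation, since the article only states that 2007 high-end cards drew "more than 70 watts":

```python
# 2D (idle) power draw figures quoted in the article, in watts.
# The 2007 baseline is approximate ("more than 70 watts").
power_2d = {
    "2007 high-end (approx.)": 70.0,
    "Radeon HD 5870": 19.6,
    "GeForce GTX 285": 29.5,
}

baseline = power_2d["2007 high-end (approx.)"]
for card, watts in power_2d.items():
    reduction = (1 - watts / baseline) * 100
    print(f"{card}: {watts} W ({reduction:.0f}% below the 2007 baseline)")
```

Even against that conservative baseline, the 5870 cuts idle draw by roughly 72% and the GTX 285 by roughly 58%.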
At entry level, power consumption has also remained relatively stable. Overall, NVIDIA and ATI have pursued the same goal: increasing performance within the same thermal envelope rather than reducing energy consumption outright. Note, however, the good energy efficiency of the Radeon HD 5400s, which, as we'll see in a forthcoming report on performance per watt, is down to the 40nm fabrication process without an increase in performance over the previous generation.