Report: AMD Radeon HD 6990, the card that breaks all the records - BeHardware
Written by Damien Triolet
Published on March 10, 2011
First planned for December, then January, then February, the new captain of the AMD fleet was finally released at the beginning of March. The reason for these delays lies in the complexity of producing what is a monster of a card, built around two Cayman GPUs similar to those on the Radeon HD 6970. The goal is clear: to replace the Radeon HD 5970, underline AMD's lead over the GeForce GTX 580 and get ready to face down the GeForce GTX 590.
Bi-GPU: now a tradition
Since the Radeon HD 3000 generation, a bi-GPU flagship has topped each new graphics card range: Radeon HD 3870 X2, Radeon HD 4870 X2, Radeon HD 5970 and now the Radeon HD 6990. Most of these cards have given AMD an advantage over NVIDIA, who, by designing enormous GPUs, has found it harder to develop bi-GPU versions and has often had to fall back on a die-shrink revision with lower energy consumption before creating its bi-GPU cards.
With Cayman, the GPU we tested here, AMD has however aimed higher than usual. While the results in terms of pure performance are mixed, power consumption has gone up. In spite of this, AMD stuck to the idea that a bi-GPU version remained on the agenda, codenamed Antilles, with the PowerTune technology to manage energy consumption levels.
To recap, PowerTune uses numerous sensors placed on each GPU block to monitor the activity levels of the GPU and then estimates energy consumption according to what is expected for those activity levels. Given that there can be significant variations from one sample to another of the same GPU, AMD has calibrated its activity-to-consumption estimates on the worst case, namely GPUs that suffer from high current leakage and run at a high temperature. Depending on the sample used, real energy consumption may therefore be similar to, or lower than, sometimes by a lot, that estimated by the GPU, which cannot measure real energy consumption.
At the same time, AMD sets an energy consumption limit for the GPU. Once this limit is exceeded, the GPU gradually reduces its clock and then increases it again when energy consumption drops. The GPU can thus adapt its clock to respect a given thermal envelope in real time.
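In other words, PowerTune is a feedback loop between an activity-based power estimate and the GPU clock. A minimal sketch of such a governor follows; the step size, clock range and function name are our own illustrative assumptions, not AMD's actual implementation:

```python
# Hypothetical sketch of a PowerTune-style governor: an estimate above the
# configured limit pulls the clock down one step; an estimate below it lets
# the clock recover toward the maximum. All numbers here are invented.
def powertune_step(clock_mhz, estimated_power_w, limit_w,
                   min_clock=500, max_clock=880, step=5):
    """Return the next GPU clock given the current power estimate."""
    if estimated_power_w > limit_w:
        return max(min_clock, clock_mhz - step)  # throttle down
    return min(max_clock, clock_mhz + step)      # recover toward maximum

# Under a sustained Furmark-like load, the clock oscillates around the point
# where the estimated consumption meets the configured limit (375W or 450W).
```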
We thought that AMD would use PowerTune to market a solution offering the best possible performance within a thermal envelope of 300W, the maximum defined by the PCI Express 2.0 standard. In fact, AMD has gone beyond this limit, probably to give the card higher performance than the Radeon HD 5970, which is based on GPUs with slightly better power efficiency. You can therefore choose to limit your Radeon HD 6990 to either 375W or 450W! This is a record, and the subject of a fair amount of discussion as a result, especially as it is not clear what AMD's guarantee covers the card for…
Specifications, the Radeon HD 6990
While the GPU on the Radeon HD 5970 was clocked down compared to the Radeon HD 5870, the Radeon HD 6990, in 450W mode, has the same GPU clock as the Radeon HD 6970. In 375W mode, it drops from 880 to 830 MHz, a 6% reduction. In both cases, the memory clock has been reduced from 1375 MHz (2750 MHz for data) to 1250 MHz (2500 MHz for data). The Radeon HD 6990 comes with 2 GB of memory per GPU, giving it a big advantage over the reference Radeon HD 5970 and allowing gamers to enjoy extreme resolution gaming without the card being limited by insufficient memory.
The reference Radeon HD 6990
AMD supplied us with a reference Radeon HD 6990, which is probably the only model that will be available in stores, as well as photos of the card taken apart (front of the PCB and cooler), telling us that the phase-change thermal interface would become ineffective if we took the card apart. We therefore left it untouched:
The Radeon HD 6990 is in exactly the same format as the Radeon HD 5970: the PCB measures 29cm and the cooler casing sticks out another centimetre, taking the total length of the card to 30cm, which means it won't fit in all casings. Thankfully, the power supply connectors are on the top of the card and not at the back. Note also that it's 2-3mm higher than usual formats.
AMD has gone for a single DVI Dual-Link connector with four mini-DisplayPort outs. This means it can drive up to 5 screens without using DisplayPort 1.2 and 6 with it. AMD says that the basic Radeon HD 6990 bundle will include a mini-DP to HDMI passive adaptor, a mini-DP to DVI Single-Link passive adaptor and a mini-DP to DVI Single-Link active adaptor. It will therefore be possible to use up to 3 screens that don't support DisplayPort straight off. Only one will however be able to use DVI Dual-Link (2560x1600 or 1920x1080 at 120 Hz).
AMD has revised the cooling system for these two GPUs. While it's still based on a blower, the blower is now positioned in the centre of the card and expels air in both directions: half goes towards the extraction grill and the other half towards the back of the card. This design means that both GPUs are on an equal footing in terms of cooling and that they're separated from each other. The space between them is occupied by the PCI Express PLX switch as well as by the power stage, which is made up of four phases for each GPU and two phases for the memory of each.
The GDDR5 memory is from Hynix and is certified at 1.25 GHz, the clock it runs at on this card. AMD say that they select the best Cayman GPU samples for the Radeon HD 6990. Stricter criteria imply that there will be less variation between any two GPUs selected and that energy consumption should be closer to the limit fixed for PowerTune.
These GPUs have been validated at 880 MHz with GPU voltage at 1.175V and at 830 MHz with voltage at 1.120V. Like the Radeon HD 6950 and 6970, the Radeon HD 6990 has two bioses and a little switch to go from one to the other. Here AMD has decided to use this functionality to offer two different modes: bios 1 has a GPU clock of 880 MHz and a PowerTune limit of 450W, and bios 2 a GPU clock of 830 MHz with a PowerTune limit of 375W.
These two energy consumption values exceed the limit defined by the PCI Express 2.0 standard but the second does respect the per channel limit:
- PCIe bus: 75W
- 6-pin connector: 75W
- 8-pin connector: 150W
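Summing these per-channel limits shows why only the 375W bios stays within the specification; a quick check with the figures quoted above:

```python
# Per-channel power limits from the PCI Express 2.0 standard, in watts.
PCIE_SLOT = 75
EIGHT_PIN_CONNECTOR = 150

# The Radeon HD 6990 feeds from the slot plus two 8-pin connectors.
channel_budget = PCIE_SLOT + 2 * EIGHT_PIN_CONNECTOR
print(channel_budget)  # 375: bios 2 fits exactly, bios 1 (450W) exceeds it by 75W
```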
With two 8-pin connectors, the Radeon HD 6990 can at least respect these standards running at 375W. With the first bios however, AMD goes beyond these specifications, which could pose a problem with certain power supplies, though most powerful single-rail models can manage this load without any problems. Note however that although AMD says it has validated the card with bios 1 and recommends the press use that one, it will be protected by a sticker on cards put on sale in stores… which brings us to the question of the guarantee, a delicate point on which AMD isn't communicating clearly, simply saying that any overclocking disqualifies the guarantee. We can therefore conclude that, from AMD's point of view, using this 'recommended' bios voids the guarantee.
At CeBIT, AMD's various partners apparently weren't up to speed on this system of two levels of specification. They were all very embarrassed when they discovered it, especially as they weren't in any position to answer our questions on it. Some were extremely irritated to be put in this position by AMD. While it looks as if no returns will be accepted on cards where bios 1 is causing a problem (this may be due to a poorly adapted system), what will happen if the card dies while bios 1 is being used and the sticker has been removed? This will probably depend on the manufacturer's policy: they will be able to decide whether or not to go beyond the AMD guarantee.
Energy consumption, noise
Energy consumption
We measured the energy consumption of the graphics card on its own. We took these readings at idle, in 3DMark 06 and in Furmark. Note that we use a version of Furmark that isn't detected by the stress test energy consumption limitation mechanism put in place by NVIDIA in the drivers for the GeForce GTX 500s.
While energy consumption at idle is the same as on the Radeon HD 5970, and pretty reasonable for a card of this calibre, it explodes under load. Strangely, PowerTune seems to function differently with the two bioses, something we imagine is linked to GPU temperature. The higher the temperature gets, the more current leakage there is, which has a direct effect on energy consumption. We can therefore imagine that AMD hasn't fully taken this higher temperature into account and hasn't been particularly conservative in defining the activity-to-consumption estimation ratios in the 450 watt mode.
This means there's more of a margin, either for overclocking the GPU or in situations of extreme load such as Furmark. Thus in 375W mode, we measured the GPU clock as varying from 535 to 830 MHz, with an average of 560 MHz, while in 450W mode, the GPU clock oscillated between 670 and 880 MHz, with an average of 765 MHz. The gain in performance in Furmark was therefore a little more than 20%! It should be noted that PowerTune acts in an extreme way here because of the load Furmark represents. Games don't push the GPU as hard in terms of energy consumption, and PowerTune then hardly kicks in at all.
Note that in spite of its extreme energy consumption, the Radeon HD 6990 does not exceed the maximum consumption defined for the PCI Express bus. While the standard authorises cards to draw 5.5A from the 12V line, the maximum level observed with the 450W bios was 5.32A. There's therefore no risk of damaging your motherboard in the way that there is with the GeForce GTX 480 and 580.
Noise levels
To observe the noise levels produced by the card, we put it in an Antec Sonata 3 casing and measured noise at idle and under load. We placed the sonometer 60 cm from the casing.
Although the Radeon HD 6990 is discreet at idle, under load it breaks the record previously set by the GeForce GTX 480s in SLI and scores over 59 dB. Here the fan is running at full speed (4900 RPM).
Note that in 375W mode, noise levels vary between 52.8 dB and 59.4 dB. The fan runs at 3600 RPM at first and then, as the temperature goes up, accelerates progressively to 4900 RPM; temperatures then drop after several tens of seconds and the fan gradually falls back to 3600 RPM, before the temperature starts to go up again and the cycle repeats. We therefore imagine that if your casing is well cooled, you should be able to keep the fan running at lower speeds. In any case, it's a long way from being quiet!
Temperatures
Still in the same casing, we took a temperature reading of the GPU using the internal sensors:
In 450W mode, the turbine reaches maximum rotation speed and the temperature climbs above 100 °C.
Here's what the infrared thermography image gives:
HD 6990 at idle
HD 6990 375W in load
HD 6990 450W in load
As the Radeon HD 6990 releases half of the hot air into the casing, the components around it heat up, particularly hard drives, which suffer quite a bit as you can see in our report on the thermal characteristics of graphics cards, which has been updated with the full results for the Radeon HD 6990.
Drivers, the test
A bug Catalyst?
Over the last few months, there has been a significant increase in little bugs in the AMD Catalyst drivers. The installer tells you for no reason that the INF file was not found, Catalyst Control Center won't launch under the pretext that no driver has been installed when in fact everything is in order, options disappear, options linked to previous architectures are added on top of those of recent cards, the parameters displayed differ from those actually activated… and so on.
We've stopped adding up the amount of time lost to all these little issues. Unlike NVIDIA, AMD doesn't offer a driver sweeper system in its installer and these problems often lead you down a dead end from which it isn't always easy to extract yourself.
No doubt many of these issues are linked to the numerous changes of graphics card that we carry out during testing, but they nevertheless show that there's a problem in how some driver functions, or rather the Catalyst Control Center itself, are coded. At a time when AMD is getting its control panel ready for a wider user base with the introduction of its Fusion systems, it would be no bad thing to tighten things up and offer a more robust solution.
For this test, AMD supplied us with drivers at the last minute and there were several CrossFire X performance issues. While previously this mode gave very good results in Starcraft II and F1 2010, this was no longer the case with the Catalyst 11.4 beta drivers. AMD has corrected the problem in F1 2010 via an update of the application profiles (Catalyst 11.2 CAP3) but not in Starcraft II. Here again, it's somewhat disquieting to see additional multi-GPU bugs appearing when you might expect, as time goes by, to see its usage become entirely transparent. It looks as if AMD may have rushed these drivers out, which is hard to understand given that we didn't come across these problems at the launch of the Radeon HD 6950 and 6970 in December, and that AMD had three more months to work on them before the arrival of the Radeon HD 6990.
Surround gaming
Given the rather high end status of the graphics solutions we're covering in this report, we wanted to take a look at how they respond in surround gaming. To do this we used three screens at 1920x1080 for a total resolution of 5760x1080.
On the AMD side, all Radeon HD 5000s and 6000s support surround gaming using Eyefinity technology. The same goes for all the CrossFire X solutions. Note however that there's a limitation in terms of support for DVI/HDMI/VGA connectivity: the Radeons can only feed two of these outs. For the additional screen or screens, you have to use the DisplayPorts and an active adaptor (supplied with the Radeon HD 6990s) if your screens don't support DisplayPort natively.
In the NVIDIA camp, things are somewhat different as any one GeForce can only pilot two screens, whatever connectivity you're using. For surround gaming, you therefore need an SLI system, or a bi-GPU card. You can then connect two screens to one of the cards and the third to the other.
AMD and NVIDIA offer similar features to create a large display area, which is seen as a single large screen by Windows or by games, facilitating support for surround gaming. Both drivers allow you to modify the position of the screens (so you don't have to fiddle around with the outs to get the screens in the right order) and introduce bezel correction to avoid any gap as a result of screen borders. We didn't activate bezel correction as it increases the resolution, from 5760x1080 to 5920x1080 for example, and introduces an additional load.
Note that while the Catalyst Control Center has given us a few problems recently, the Eyefinity page has never caused any trouble at all, in contrast to the NVIDIA drivers which seem very capricious on this point. Once the large surround surface area is set up with the NVIDIA solutions, everything works perfectly, but you do have to manage to get that far!
What with all the system crashes, driver control panel crashes, display issues and so on, we had quite a time! After numerous attempts and changes of connectors, we finally managed to get surround gaming going, without finding any logic or similarity across the three systems tested. We imagine these problems are linked to the use of HDMI (used for two or three of the screens) and to models customised by partners, as the connectivity then differs from the reference models. In any case, it looks as if NVIDIA didn't check enough combinations in its validation process.
The test
For this test, we decided to abandon Need for Speed Shift and Arma 2, which are limited by the CPU with graphics cards of the calibre of those tested here. The tests were carried out at 1920x1200 with MSAA 4x and MSAA 8x, as well as at 2560x1600 with MSAA 4x, except for Crysis and Far Cry 2, which were tested at several additional resolutions. We also tested all these solutions at 5760x1080 with and without MSAA 4x. All settings were pushed to a maximum. The most recent updates were installed and all the cards were tested with the most recently available drivers.
We decided to stop showing decimals in game performance results so as to make the graphs more readable. We nevertheless note these values and use them when calculating the index. If you're observant you'll notice that the size of the bars also reflects this.
The Radeons and the GeForces were tested with texture filtering at the high quality setting. All the Radeons were tested with the Catalyst 11.4 beta driver (8.84.3 beta 2). All the GeForces were tested with beta 267.31 drivers.
Test configuration
Intel Core i7 980X (HT deactivated)
Asus Rampage III Extreme
6 GB DDR3 1333 Corsair
Windows 7 64 bits
Forceware 267.31 beta
Catalyst 11.4 beta (8.84.3 beta2) + Catalyst 11.2 CAP3.
To test Starcraft 2, we launched a replay and measured performance following one player's view.
All graphics settings were pushed to a maximum. The game doesn't support antialiasing, which was therefore activated in the control panels of the AMD and NVIDIA drivers. Patch 1.0.3 was installed.
A problem that appeared with the latest drivers affected performance in CrossFire mode.
To avoid any cheating, the Starcraft 2 developers have prevented the use of surround gaming.
The Mafia II engine passes physics handling over to the NVIDIA PhysX libraries and takes advantage of this to offer high physics settings which can be partially accelerated by the GeForces.
To measure performance we used the built-in benchmark; all graphics options were pushed to a maximum, first without activating the PhysX effects accelerated by the GPU:
Here, in 375W mode, the Radeon HD 6990 is on a par with the Radeon HD 6950 2GBs in CrossFire.
Next, we set all PhysX options to high:
With PhysX effects pushed to a maximum, performance levels dive. Note that they are in part limited by the CPU, as not all additional PhysX effects are accelerated. The Radeons are of course a long way behind.
Mafia II can detect when a large display area is made up of three monitors and registers them as 3 x 1920x1080 and not 5760x1080, which enables it to adapt the user interface automatically. For reasons unknown to us however, this detection system fails with a Radeon HD 6990 + Radeon HD 6970 pairing (triple CrossFire X) as well as with the Radeon HD 5970. The interface is then unusable and the benchmark mode is only displayed on one screen.
On the GeForce side, activating antialiasing has no effect. We imagine that NVIDIA must have blocked the activation of antialiasing in its drivers to avoid the impact of insufficient memory.
Crysis Warhead replaces Crysis and uses the same resource-heavy graphics engine. We tested it in version 1.1 and in 64-bit mode, as this is the main innovation. Crytek has renamed the different graphics quality modes, probably so as not to dismay gamers who may be disappointed at not being able to activate the highest quality mode because of excessive demands on system resources. The high quality mode has been renamed "Gamer" and the very high mode is called "Enthusiast". This is the one we tested.
The Radeon HD 6900s do pretty well here and benefit from having 2 GB of memory at 2560x1600 AA8x.
The GeForce GTX 560s in SLI haven't got enough memory to launch Crysis in these conditions. The same goes for the Radeons in CrossFire when they are only equipped with 1 GB of memory, but only when antialiasing is activated. Triple CrossFire comes in very useful here!
Far Cry 2
This version of Far Cry isn't really a great development, as Crytek made the first episode. As the owner of the licence, Ubisoft handled its development, with Crytek working on Crysis. It's no easy thing to inherit the graphics revolution that accompanied Far Cry, but the Ubisoft teams have done pretty well, even if the graphics don't go as far as those in Crysis. The game is also less resource heavy, which is no bad thing. It has DirectX 10.1 support to improve the performance of compatible cards. We installed patch 1.02 and used the "ultra high" quality graphics mode.
Although the GeForces do particularly well in Far Cry 2, the CrossFire system does slightly better than the SLI, which allows the Radeons to position themselves better. Strangely, the Radeon HD 6950 1 GB cards give significantly higher performance than the Radeon HD 6950 2 GBs.
For reasons unknown to us, the Radeons are greatly limited by the CPU in Far Cry 2 in surround gaming, which gives the GeForces an advantage. The CrossFire solutions however have a framerate of over 60 fps and this is more than sufficient to enjoy Far Cry 2 without any problem.
H.A.W.X. is a flying action game. It uses a graphics engine that supports DirectX 10.1 to optimise performance. Among the graphics effects it supports, note the presence of ambient occlusion, which was pushed to a max along with all other options. We used the built-in benchmark and patch 1.2 was installed.
The GeForces are more efficient here and the GTX 560 Ti pairing is on a par with the Radeon HD 6990.
All the solutions tested here allow you to enjoy surround gaming in this game, with the exception of the single GeForce GTX 580 of course, as it doesn't support this mode on its own. Note that at this resolution, H.A.W.X doesn't offer antialiasing beyond 2x.
As the first game with DirectX 11, or more precisely Direct3D 11, support, we couldn't not test BattleForge. An update added in September 2009 gave it support for Microsoft's new API.
Compute Shaders 5.0 are used by the developers to accelerate SSAO (ambient occlusion) processing. Compared to the standard implementation via Pixel Shaders, this technique allows more efficient use of the available processing power by saturating the texturing units less. BattleForge offers two SSAO levels: High and Very High. Only the second, called HDAO (High Definition AO), uses Compute Shaders 5.0.
We used the game's benchmark and installed the latest available update (1.2 build 304941).
The GeForce GTX 560 Tis finish here on a level with the Radeon HD 6990, which only has a lead at 2560x1600 AA4x. Activation of antialiasing 8x has a bigger impact on the AMD cards here.
Activation of antialiasing 4x has less of an impact on the GeForces.
Pretty successful visually, Civilization V uses DirectX 11 to improve quality and optimise performance in the rendering of terrain, thanks to tessellation, and to implement a special compression of textures, thanks to the compute shaders. This compression allows it to retain the scenes of all the leaders in memory. This second usage of DirectX 11 doesn't concern us here however, as we used the benchmark integrated on a game map. We zoomed in slightly so as to reduce the CPU limitation, which has a strong impact in this game.
All settings were pushed to a max and we measured performance with shadows and reflections. Patch 1.2 was installed.
The GeForces do very well here and CPU limitation doesnít kick in as quickly as for the Radeons.
Although Civilization V supports surround gaming, unfortunately its built-in benchmark tool doesn't.
S.T.A.L.K.E.R. Call of Pripyat
This new S.T.A.L.K.E.R. sequel is based on a new development of the graphics engine, which moves up to version 01.06.02 and supports Direct3D 11, used both to improve performance and quality, with the option of more detailed lighting and shadows as well as tessellation support.
Maximum quality mode was used and we activated tessellation. The game doesn't support 8x antialiasing. Our test scene is 50% outside and 50% inside, and the inside part is populated with several characters.
The Radeon HD 6900s are very efficient here, especially at 2560x1600.
The Radeons are also very efficient in surround mode, especially with antialiasing where they benefit from their 2 GB memory. The 1 GB solutions are clearly limited by their memory here.
The latest Codemasters title, F1 2010, uses the same engine as DiRT 2 and supports DirectX 11 via patch 1.1, which we of course installed. As this patch was developed in collaboration with AMD, NVIDIA told us that they had only received it late in the day and hadn't yet had the opportunity to optimise their drivers for the DirectX 11 version of the game.
We pushed all the graphics options to a max and used the game's own test tool on the Spa-Francorchamps circuit with a single F1 car.
In F1 2010, the Radeons are particularly at ease. Note that triple CrossFire X doesn't bring any gain here.
For us, F1 2010 is the game in which surround gaming adds most to the experience, notably improving immersion and giving more of a sense of speed.
Although triple CrossFire X still doesn't bring any gain in this game, all the Radeon 2 GB bi-GPU solutions allow you to play F1 2010 with antialiasing in surround gaming mode. Things aren't quite as good for the GeForce GTX 580s in SLI, but they nevertheless run with good fluidity. With the GeForce GTX 570s in SLI, you have to forget antialiasing however.
Probably the most demanding title right now, Metro 2033 forces all recent graphics cards to their knees. It supports GPU PhysX, but only for the generation of particles during impacts, a rather discreet effect that we therefore didn't activate during the tests. In DirectX 11 mode, performance is identical to DirectX 10 mode but with two additional options: tessellation for characters and a very advanced, very demanding depth of field feature.
We tested it in DirectX 11 mode, at a very high quality level and with tessellation activated, first with MSAA 4x alone. Next we measured performance with MSAA 4x and Depth of Field.
The cards equipped with just 1 GB of memory can't handle the load when Depth of Field is activated.
The very heavy Metro 2033 has all the graphics solutions on their knees in surround gaming mode. You need 3 GPUs to enjoy good quality here, each with 2 GB of memory if you're going to activate antialiasing.
Performance recap
Although individual game results are worth looking at, especially as high-end solutions are susceptible to being levelled by CPU limitation in some games, we have calculated a performance index based on all tests, with the same weight for each game. Mafia II is included with the scores obtained without GPU PhysX effects.
We attributed an index of 100 to the GeForce GTX 580 at 1920x1200 with AA4x:
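As a sketch of how such an equal-weight index works (the game names are real but the frame rates below are invented purely for illustration):

```python
# Minimal sketch of an equal-weight performance index: each game's result is
# divided by the reference card's result, and the ratios are averaged so that
# the reference card scores exactly 100.
def performance_index(card_fps, reference_fps):
    ratios = [card_fps[game] / reference_fps[game] for game in reference_fps]
    return 100 * sum(ratios) / len(ratios)

# Invented frame rates, for illustration only.
reference = {"Crysis Warhead": 40.0, "Far Cry 2": 80.0, "Metro 2033": 25.0}
candidate = {"Crysis Warhead": 60.0, "Far Cry 2": 96.0, "Metro 2033": 37.5}
print(round(performance_index(candidate, reference), 1))  # 140.0
print(round(performance_index(reference, reference), 1))  # 100.0
```

Because each game contributes the same weight, a card that dominates one title cannot drag the whole index up on its own.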
The Radeon HD 6990 does the job in terms of performance with a gain of 22 to 32% over the Radeon HD 5970 in 375W mode and between 46 and 57% over the GeForce GTX 580.
The 450W mode gives it a gain of 2.9 to 3.4% over the 375W mode, depending on the resolution, with the lowest gain coming with MSAA 8x. We suppose that memory bandwidth limitations are partly holding it back here, which would explain why the gains are lower than the 6% clock difference between the two bioses.
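The 6% figure follows directly from the two bios clocks quoted earlier; since the memory clock is identical in both modes, any memory bandwidth limitation caps the real gain below it:

```python
# The two bios GPU clocks quoted in the article; the memory clock (1250 MHz)
# is the same in both modes, so only the GPU clock differs.
bios1_gpu_mhz = 880  # 450W mode
bios2_gpu_mhz = 830  # 375W mode

clock_gain_pct = (bios1_gpu_mhz / bios2_gpu_mhz - 1) * 100
print(f"{clock_gain_pct:.1f}%")  # 6.0%: the upper bound on bios 1's gain in games
```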
In comparison to the CrossFire systems, the Radeon HD 6990's 375W mode puts it on a par with the Radeon HD 6950 2 GB cards, while the 450W mode places it mid-way between these cards and the Radeon HD 6970s.
The Radeon HD 6990 is also at a similar level to the GeForce GTX 570s in SLI, with the GeForce GTX 580s in SLI having a small advantage. The Radeon HD 6990 + Radeon HD 6970 pairing is on sale at a similar price to the GTX 580s in SLI but gives a better level of performance.
Radeon HD 6990 vs CrossFire X
We've represented our results so as to show the difference in performance between the Radeon HD 6990 and the Radeon HD 6950 and 6970 in CrossFire X:
Radeon HD 6950 CFX: 2 GB vs 1 GB
We also wanted to show the difference in performance between the CrossFire systems based on the Radeon HD 6950 2 GB and 1 GB cards:
Having 2 GB of memory gives a big gain in Crysis Warhead as well as in Metro 2033. Beyond 1920x1200 with FSAA 4x, it's impossible to game on the 1 GB system in these titles. Note that in the case of Metro 2033, we weren't running at 1920x1200 AA8x but rather 1920x1200 AA4x + DoF.
Outside of these two cases, the performance of the CrossFire X system based on the Radeon HD 6950 2 GB cards is generally slightly lower than that of the Radeon HD 6950 1 GB cards, a phenomenon we also observed with the single card systems. The difference is however astonishingly high in Far Cry 2.
In surround gaming mode, using 2 GB takes on more importance and completely changes things once antialiasing is activated.
Conclusion
Although the Radeon HD 6990 does indeed break new records, this isn't always a positive thing unfortunately. While it is obviously part of the line of previous generation bi-GPU Radeons, the high energy consumption and noise issues of those cards are amplified on the HD 6990. If you have to dissipate 375 to 450 watts, how else are you going to do it than by running your fan loud and fast?
If this is going to trouble you, it's best to admit it right from the start: you'll be better off with a couple of GeForce GTX 570s in SLI, a high end multi-GPU solution with more acceptable noise levels, though with similar energy consumption.
With PowerTune, we thought that AMD would be giving us a rather tame Radeon HD 6990, but they had other ideas: they wanted to develop a power hungry monster and take the thermal envelope way over 300 watts to 375 watts and even 450 watts! One of the particularities of this model is that it has two modes, each with different clocks and energy consumption limits.
Using the 450 watt mode, although on the one hand recommended by AMD, should however be considered as killing the guarantee. It has above all been designed with a view to offering a margin for overclocking and limiting the GPU less during very heavy loads. In practice, in games, it only brings a small gain in comparison to the 375 watt mode. We therefore advise you to use this second bios, especially as you ought to be able to keep noise levels down if you take care to cool your casing properly.
On sale at €600, this card is obviously aimed at a limited market, especially as the Radeon HD 6950 and 6970 in CrossFire offer a similar price/performance ratio, a little less noise and a different format that will fit a higher number of cases. The Radeon HD 6990 will be useful above all for setting up an extreme system, by adding a second Radeon HD 6990 or just a Radeon HD 6970, without monopolising all the slots on your motherboard or requiring an XL-ATX model. The graphics power you'll then have at your disposal will allow you to benefit fully from surround gaming.
For AMD, the main thing about this card is that it allows it to hold onto the no.1 spot on the podium in terms of performance. While it seems clear that the forthcoming GeForce GTX 590 would have had no trouble facing down the Radeon HD 5970, the Radeon HD 6990 is another proposition altogether and it looks as if a hotly contested duel is on the way!
To finish up, a word on the drivers. A few small performance issues with the latest drivers remind us that they play a primordial role in the correct functioning of any multi-GPU system. Almost six years after the advent of these systems, we would have liked them to feature less prominently in our analysis. AMD has taken a step in the right direction by allowing a simple update of the application profiles, and a second step will be the implementation of automatic updates, which shouldn't be long in coming. All they'll then have to do is avoid reintroducing bugs where everything was already working correctly!
Copyright © 1997-2013 BeHardware. All rights reserved.