
AMD Radeon Vega Frontier Edition review: A wanderer between worlds


Temperature curve and clock rate

The fan control is quite conservative, so the maximum temperature of 84°C (briefly up to 85°C) is reached relatively quickly. By then, however, the card has already lost approx. 10% of its performance compared to the cold state, almost exclusively through an automatic reduction of the clock rate. We calculated the average clock frequency in 5-degree increments from all the ups and downs of the respective run; it ranged from 1440 MHz in the cool state down to 1269 MHz in the worst, hottest case.
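The averaging just described, grouping the logged clock samples into 5°C temperature bins and taking the mean per bin, can be sketched roughly as follows. The sample points are invented for illustration; only the 1440 MHz and 1269 MHz endpoints come from our measurements:

```python
from collections import defaultdict

def average_clock_by_temp(samples, bin_width=5):
    """Group (temperature, clock) samples into temperature bins of
    `bin_width` degrees and return the mean clock per bin."""
    bins = defaultdict(list)
    for temp, clock in samples:
        bins[int(temp // bin_width) * bin_width].append(clock)
    return {b: sum(c) / len(c) for b, c in sorted(bins.items())}

# Invented sample points (temperature in °C, clock in MHz):
samples = [(62, 1440), (63, 1432), (67, 1390), (71, 1350),
           (76, 1310), (81, 1280), (84, 1269)]
print(average_clock_by_temp(samples))
```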

The following graph shows that clock and performance scale quite well: taking 1269 MHz as the starting point, a clock increase of approx. 12% to 1440 MHz still yields approx. 10% more gaming performance.

Temperature curve and power consumption

Now it gets really interesting. We measure an average of 266 watts at the thermally limited 1269 MHz, and just under 300 watts at 1440 MHz. For approx. 12% more clock we therefore need approx. 13% more supplied power and receive approx. 10% more gaming performance. The deal is thus quite fair and shows that, with scaling this good, a lot of headroom theoretically still remains, even if a board power well above the 300-watt mark already seems somewhat questionable.
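These ratios can be recomputed from the raw figures in a few lines. Note that the exact percentages come out slightly different from the rounded values in the text; the 10% performance gain is our measured value, not a derived one:

```python
# Measured operating points (board power in watts, average clock in MHz)
p_hot,  f_hot  = 266, 1269   # thermally limited steady state
p_cold, f_cold = 300, 1440   # cold / boost state

clock_gain = f_cold / f_hot - 1   # ~13% higher clock
power_gain = p_cold / p_hot - 1   # ~13% more power drawn
perf_gain  = 0.10                 # ~10% more gaming performance (measured)

# Performance per watt changes only marginally -> scaling is nearly linear
ppw_ratio = (1 + perf_gain) / (1 + power_gain)
print(f"clock +{clock_gain:.0%}, power +{power_gain:.0%}, perf/W ratio {ppw_ratio:.3f}")
```

The perf/W ratio of roughly 0.975 is what the text means by a "fair deal": pushing the clock costs only about 2.5% in efficiency.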

What can also be said is that leakage losses obviously no longer play a major role. The days when you could easily save around 40 watts or more at the same clock, as long as the temperature stayed low enough, should be a thing of the past. That's something, at least.

Temperature curve: GPU vs. HBM2 memory modules

As far as the read-out values can be trusted, we can assume a maximum of 84°C (85°C peak) for the GPU and a maximum of 95°C (96°C peak) for the HBM2 modules. The latter appears relatively high, but it is also the value considered the upper limit for current GDDR5X memory. Of course, we will keep an eye on these values during further tests, because we cannot yet confirm the accuracy of the sensor interpretation 100 percent.

Temperature gradients on the board ("heat flux")

What we can determine immediately: the board below the package is approx. 5°C cooler than the values we measured inside the GPU! But why? The explanation was already on page two: a very thick package substrate sits between the interposer and the PCB. In addition, the interposer obviously does not sit fully flat on the package ("underfill issue"), so the air in between acts almost like an insulating layer.

The following video shows this warm-up once again in time-lapse; we simply sped up the measured 20 minutes by a factor of 10:

 

In the stress test, the temperature is slightly lower, which is probably due to the increased fan activity and the regulation via the lowered clock (see power consumption).

Noise emission ("volume")

To keep it brief: this is the best reference cooler AMD has launched since the Radeon HD 2900. The significantly improved radial fan, which this time comes from Delta, ensures that the card definitely reaches the level of the GeForce GTX 1080 Founders Edition. The measured 44 dB(A) is the maximum value, and it is almost broadband enough to mask the coil noise generated at very high FPS values. The video illustrates this quite well, especially toward the end of the clip:

 

This statement and the subjective impression also match the frequency analysis in the following spectrum. This time we see no snarling, low-frequency noises of the kind generated by bad bearings and cheap motors. The so-called coil whine is audible this time in the range around 8 kHz, which can be seen very well in the graphic below:

The majority, however, is accounted for by the fan, and that is really acceptable. A GeForce GTX 1080 Ti with approx. 250 watts of waste heat to dissipate is not significantly quieter, even though it has somewhat less to dissipate. If you raise it to a comparable 270 watts via a higher power target, the noise emission is roughly the same. From this point of view one can already speak of a kind of tie, especially since both manufacturers have probably reached the limits of what is physically feasible. At around 35 USD, the cooler could hardly come much cheaper either; spending more would hardly add any real value economically.


About the author

Igor Wallossek

Editor-in-chief and namesake of igor'sLAB, the content successor of Tom's Hardware Germany, whose license was returned in June 2019 in order to better meet the quality demands of web content and the challenges of new media such as YouTube with its own channel.

Computer nerd since 1983, audio freak since 1979 and pretty much open to anything with a plug or battery for over 50 years.

Follow Igor:
YouTube Facebook Instagram Twitter
