Life as the owner of an AMD RX Vega graphics card is not always easy. The card has often been called a “flop”, its efficiency has been criticized, and in general it was dismissed as not being a real “gaming graphics card”. Anyone who defends such a thing must be a fanboy, right? I already showed in another article that, as an interested tinkerer, you can improve its efficiency considerably. Benchmarks don’t lie (most of the time) either, and as various retests and current game reviews show, its performance has kept pace with the corresponding Pascal counterparts from Nvidia, which no objective tester would call “gaming cripples” after all.
Why do I go so far as to “defend” an AMD graphics card in the introduction to a GeForce RTX review? Very simple: I want to make it clear from the start that I don’t explicitly prefer any manufacturer and that I have always bought the product that offered the best price-performance ratio within my budget at the time of purchase. Yes, I have owned (clearly) more GeForce than Radeon cards, and no, I don’t really like Nvidia’s pricing policy. For the latter reason I was annoyed often enough about the prices of Nvidia’s Turing graphics cards, and with regard to ray tracing performance I also liked to take shots at the “RTX on” slogan.
After all, the first implementations of DLSS were an insult to the eyes, and the ray tracing effects cost so much performance that even the wickedly expensive 2080 Ti was often brought to its knees at Full HD. Lately, however, there has been more and more praise for the Turing features, so I wanted to take a look for myself. The aim of this review is to examine some of the Turing features in detail and then evaluate them neutrally and objectively. Of course, I couldn’t resist a small comparison between the 2060 Super and the Vega 64, which should only be seen as a small bonus – the cards were (at the time of purchase) at a quite similar price level, and I’m simply curious whether a slightly optimized custom Vega doesn’t perform a bit better than the reference models often used in tests.
DLSS – Game changer or just mush?
I was so interested in DLSS that I wrote a separate article about it. In short, the first iteration was rather sobering, even disappointing, as you can easily see in the following slider:
DLSS 2.0, on the other hand, does a lot of things right and, despite a noticeable increase in FPS, in some cases even improves image quality:
I will give a final evaluation in my conclusion. Incidentally, Igor recently measured that DLSS can even save power.