Summary
The GeForce RTX 4070 Super has made considerable gains, and from NVIDIA’s point of view for good reason. Once again we see that competition stimulates business: Team Green ultimately reacted (had to react) to the Radeon RX 7800XT, which, with its now optimized drivers, came too close to the GeForce RTX 4070 non-Super, could even beat it comfortably at higher resolutions, and was still attractively priced.
Although NVIDIA has neither expanded the memory nor widened the memory interface, it has at least significantly increased the L2 cache in order to optimize queries and response times, especially at higher resolutions. Overall, a chip with significantly more streaming multiprocessors (including more Tensor and RT cores) and the larger cache does not reach the roughly 20% increase that is theoretically possible on paper, but (depending on the load) 15 to 16% is consistently achievable as long as the CPU is not the limiting factor. That is real progress.
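A quick back-of-the-envelope sketch puts that scaling into perspective. The CUDA core counts below are NVIDIA’s published specifications for the two cards; the measured uplift is the 15–16% figure from this review:

```python
# Back-of-the-envelope scaling check.
# Core counts are NVIDIA's published specs; the measured uplift is from this review.
cores_4070 = 5888        # CUDA cores, GeForce RTX 4070
cores_4070_super = 7168  # CUDA cores, GeForce RTX 4070 Super

# What the extra cores alone would promise on paper (ignoring clocks and bandwidth).
theoretical_gain = cores_4070_super / cores_4070 - 1.0
print(f"Theoretical uplift from cores alone: {theoretical_gain:.1%}")  # ~21.7%

# Upper end of the 15-16% actually measured in the benchmarks.
measured_gain = 0.16
scaling_efficiency = (1 + measured_gain) / (1 + theoretical_gain)
print(f"Realized fraction of paper scaling: {scaling_efficiency:.1%}")  # ~95%
```

The roughly 95% realized scaling shows why the card lands a few points short of the paper value: with an unchanged memory interface and slightly lower real-world clocks, the additional cores cannot be fed perfectly at all times.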
The few percentage points left on the table are also due to the slightly lower clock rate, which sits so close to the sweet spot that the RTX 4070 Super is even a tad more efficient than the non-Super variant. At less than 220 watts, the Super is clearly ahead. Of course, NVIDIA also leaves the OC cards more headroom in terms of power consumption, and it will certainly be possible to close the gap to the theoretical performance, though only at the expense of efficiency, even if that makes the “old” GeForce RTX 4070 Ti sweat a little. It is therefore only logical that NVIDIA is also upgrading that card and making it faster. After all, the efficiency analysis clearly shows that efficiency does not decrease at all with the higher power limit and the many additional cores, quite the contrary.
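The efficiency claim can be illustrated the same way. The TBP values below are NVIDIA’s official board-power specifications for the two cards; the relative performance figure is the roughly 16% uplift measured in this review, so treat the result as an illustration rather than a lab measurement:

```python
# Perf-per-watt illustration.
# TBP values are NVIDIA's official specs; the ~16% uplift is from this review.
tbp_4070 = 200.0        # watts, GeForce RTX 4070
tbp_4070_super = 220.0  # watts, GeForce RTX 4070 Super

rel_perf_4070 = 1.00    # baseline
rel_perf_super = 1.16   # ~16% faster at the same settings

eff_4070 = rel_perf_4070 / tbp_4070
eff_super = rel_perf_super / tbp_4070_super
print(f"Efficiency advantage of the Super: {eff_super / eff_4070 - 1:.1%}")  # ~5.5%
```

Even at the nominal power limits, 16% more performance for 10% more board power works out to roughly 5% better performance per watt, which matches the observation that the Super is “a tad more efficient” despite the larger chip.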
In Full HD, the GeForce RTX 4070 Super is an excellent card when it comes to the highest frame rates, and it is also ideally suited to WQHD. In Ultra HD at the latest, however, you will occasionally have to consider smart upscaling, and this is where DLSS comes into play. Games such as “The Last of Us Part I” (TLOU) now subjectively look even better in Ultra HD with DLSS than at native Ultra HD. This is where NVIDIA can really play to its strengths, which DLSS 2.x and above all DLSS 3.5 also offer in purely visual terms. If a game additionally supports frame generation and you would otherwise still be bobbing around in the barely playable FPS range even with super sampling, it can even be a lifeline to good playability. It does not improve latency, but not every genre is as latency-bound as various shooters. I would have really liked DLSS 3.5 for TLOU, but you can’t have everything.
You get all the advantages of the Ada architecture from 659 euros (MSRP cards) and, in the context of the current price spiral, could be more than satisfied with it, were it not for the memory capacity and the narrow memory interface, which I already criticized with the GeForce RTX 4070 Ti and consider too tight with regard to the future. Yes, at the moment this may still suffice for WQHD and usually also for UHD, but games like TLOU unfortunately show that resources are being used ever more wastefully, and the memory could fill up faster than you can say “pug”. As you can see, there is always something, and the larger cache only helps to a limited extent. I also see a certain need for improvement in the Studio drivers: there is still a lot of untapped potential here, especially in the professional sector with older standard software, as I have already mentioned.
The NVIDIA GeForce RTX 4070 Super FE 12GB
The cooler is still okay in the context of the TBP, but the backplate pads next to the hotspot (and not behind it) really shouldn’t have been there; that is a bit borderline. The board as such is only average, because the cheap voltage converters in particular are somewhat disappointing, especially since at least one additional phase (or more, with a load balancer) could easily have been implemented without major layout changes. But savings had to be made somewhere, even if it was only a few DrMOS, coils and bulk capacitors. There is no need to write anything more about the 12VHPWR/12V-2×6 connector, as we have already chewed through that topic enough. At less than 250 watts, nothing is happening there anyway.
The fact that NVIDIA has adopted my pad mod from the older cards without comment is quite remarkable. The relocation of the primary shunt to the back of the board, including its cooling, is also charming if you read through my earlier tests on this again. On the other hand, it is rather amusing that this was implemented on a comparatively low-power card, though the card is well suited as a practice project. Visually and haptically, the Founders Edition is, as always, a gem and a collector’s item, this time in black. Either way, the FE would certainly have been on my shortlist if I were looking for a WQHD card like this.
Conclusion
The GeForce RTX 4070 Super with the AD104-350 is a highly interesting mid-range card that no longer has to fear a direct competitor from AMD in this Super generation, at least until Team Red brings a slimmed-down and attractively priced RX 7900 non-XT to the German market or pushes the RX 7900 GRE into the normal retail channel instead of supplying only system integrators. In terms of efficiency, NVIDIA is once again setting the standard by which AMD must (but currently cannot) be measured. Whether and when an RX 7900 without XT or a GRE for everyone will arrive is still written in the stars. But gamers live in the here and now, and there are simply no alternatives at the moment if you want the complete feature set including high-quality super sampling, frame generation and AI.
Apart from the outdated DisplayPort connector and the still somewhat meagre 12 GB of memory for Ultra HD, I don’t see any disadvantages of the GeForce RTX 4070 Super that would speak against this card. The price is okay so far if you put it in relation to the performance of the other cards, because AMD isn’t really any cheaper. The manufacturers will hardly make big profits with the MSRP cards, that much I can tell you, but they won’t starve either. Much of it is little more than a zero-sum game that only becomes somewhat profitable through volume.
The graphics cards were provided by NVIDIA for this test without obligation. The only condition was compliance with the embargo; there was no influence or compensation.
- 1 - Introduction, technical Data and Features
- 2 - Test System and Equipment
- 3 - Teardown: PCB, Components and Cooler
- 4 - Material Analysis and a Surprise
- 5 - Gaming Performance FHD (1920 x 1080)
- 6 - Gaming Performance WQHD (2560 x 1440)
- 7 - Gaming Performance Ultra-HD (3840 x 2160)
- 8 - Gaming Performance DLSS vs. FSR
- 9 - Gaming Performance Frame Generation
- 10 - Latencies in Detail
- 11 - Workstation Graphics and Rendering
- 12 - Power Consumption and Load Balancing
- 13 - Transients and PSU Recommendation
- 14 - Temperatures, Clock Rate and Infrared Analysis
- 15 - Fan Curves and Noise
- 16 - Summary and Conclusion