Anticipation is the greatest joy, or at least that’s what they say. Even though Intel’s DG1 was designed and advertised as an entry-level card from the start, a minimum of features and functionality can still be expected from such a midget. Many things scale up nicely, as long as the test candidate actually lets itself be tested. Through a good acquaintance, I had the exclusive opportunity to analyze one of the few complete systems, at least on call and from a distance, to dare a teardown and attempt some benchmarks. Torx works.
Important preliminary remark
Since the pictures taken with a smartphone didn’t look quite as smart as one would expect from my website, I reworked all of them first and, in particular, straightened the edges a bit. Just in case anyone was wondering: the card really exists, and the pictures are all originals. The tests are fresh, i.e. from yesterday, which the date in the screenshots also shows.
With the DG1 (Desktop Graphics 1), Intel initially wants to serve the mass-market entry-level segment and gain valuable experience along the way. The whole thing is then called Iris Xe MAX (notebook) or simply Iris Xe (desktop). Since this card will not be sold individually but is intended for SIs (system integrators) and OEMs, certain limits on the compatible hardware are pre-programmed and, in fact, deliberate. That’s exactly why I wrote “total work of art” above, because the tested card was delivered (and tested) in a complete OEM PC.
This is also where the interested tester’s odyssey begins, because this DG1 SDV (Software Development Vehicle) only runs on special motherboards with graphics output and with a few Intel CPUs selected for it, all of which have an iGPU; overclocking is not possible either. Of course, this also prevents individual pre-distributed DG1 SDV cards from being tested outside the intended Intel microcosm. I don’t want to spoil anything yet, but even that doesn’t save this DG1.
Technical data, temperatures and power consumption
Interestingly, the original DG1 SDV from Intel is the full mobile version with 96 EUs (Execution Units), and it dutifully reports itself in every program as Intel Iris Xe MAX. The DirectX 12 chip is manufactured in Intel’s own 10 nm SuperFin process, which is quite competitive, because the numbers in the nomenclature don’t say much at first. In the card tested here, the 96 EUs thus result in 768 FP32 ALUs (“shaders”). That’s even a bit more than is budgeted for the Iris Xe “trimmed” to 80 EUs (640 ALUs), which will be built by vendors like Asus.
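To make the shader arithmetic above reproducible, here is a minimal sketch. It assumes the usual Xe-LP layout of 8 FP32 ALUs per EU, which is exactly what the 96 → 768 and 80 → 640 figures imply:

```python
# Back-of-the-envelope shader count, assuming 8 FP32 ALUs per Xe-LP EU
# (an assumption that matches the 96 -> 768 and 80 -> 640 figures above).
ALUS_PER_EU = 8

def fp32_alus(eu_count: int) -> int:
    return eu_count * ALUS_PER_EU

print(fp32_alus(96))  # 768 ALUs -> DG1 SDV / Iris Xe MAX
print(fp32_alus(80))  # 640 ALUs -> trimmed Iris Xe for board partners
```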
A total of 8 GB of LPDDR4 SDRAM at 2133 MHz is connected to the 128-bit memory interface, and at least PCIe 4.0 is supported, even if only eight lanes are wired up. This once again underlines the card’s mobile origins. The board partners’ OEM cards will probably even have to make do with only 4 GB. Let’s take a look at the data from idle mode: at a low 600 MHz clock speed, the power consumption of the chip (not the whole card!) is just under 4 watts, and 850 rpm on the single fan is enough for a GPU temperature of 30 °C at idle.
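For classification, a rough peak-bandwidth estimate based on the figures above. This is only a sketch, assuming the quoted 2133 MHz is the real clock of a double-data-rate interface (i.e. 4266 MT/s effective):

```python
# Rough peak memory bandwidth estimate for the DG1 SDV, assuming the quoted
# 2133 MHz is the real clock of a double-data-rate LPDDR4 interface.
bus_width_bits = 128
real_clock_mhz = 2133
transfers_per_clock = 2                                       # DDR assumption

data_rate_mtps = real_clock_mhz * transfers_per_clock         # ~4266 MT/s
bandwidth_gbs = data_rate_mtps * 1e6 * (bus_width_bits / 8) / 1e9

print(f"{bandwidth_gbs:.1f} GB/s")  # roughly 68 GB/s under these assumptions
```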
Using a simple GPGPU application, 100% GPU load could be generated; the usual stress-test programs, however, are on strike because of the driver (I’ll get to that in a moment). The clock rate then sits at 1550 MHz, and more could never be reached even at partial loads. This means the real DG1 is still 100 MHz below what had been assumed in advance for the mobile variant. It’s safe to assume that the offshoots from Asus or Colorful won’t clock significantly higher either. A good 20 watts for the GPU alone should then result in around 27 to 30 watts for the entire card, because the fan, at up to 1800 rpm, also hangs on the same electrical umbilical cord. The 50 °C point to a rather aggressive fan curve and the suspicion that the card could otherwise get too hot after all.
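For anyone who wants to reproduce the full-load reading despite the striking stress-test tools: a simple GPGPU busy loop is enough. The following is only a sketch, assuming pyopencl, numpy and a working OpenCL runtime for the Iris Xe; it is not the application used for the measurements above:

```python
# Minimal GPGPU busy loop to hold the GPU near 100% load.
# Sketch only: assumes pyopencl, numpy and an OpenCL runtime for the Iris Xe.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

kernel_src = """
__kernel void burn(__global float *data, const int iters) {
    int gid = get_global_id(0);
    float x = data[gid];
    for (int i = 0; i < iters; ++i) {
        x = x * 1.000001f + 0.000001f;   // simple FMA-style busy work
    }
    data[gid] = x;
}
"""
prg = cl.Program(ctx, kernel_src).build()

n = 1 << 20
host = np.random.rand(n).astype(np.float32)
buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                hostbuf=host)

# Enqueue the kernel repeatedly while clocks, power and temperature are
# logged with an external monitoring tool.
for _ in range(500):
    prg.burn(queue, (n,), None, buf, np.int32(20000))
queue.finish()
```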
The memory temperatures can’t be read out directly, but they are at least monitored internally, because the four 2 GB modules from Micron aren’t really cooled. Which brings us to the teardown, so to speak. I have to pixelate the test system everywhere for source-protection reasons, of course, because Intel will hardly be pleased if someone feeds third parties like me with information. The clues would give away the culprit, so it will have to suffice to mention that it was a Core i7 non-K model on a Z390 mini-ITX board.