NVIDIA H100 NVL graphics processor with 94 GB HBM3 memory – Exclusively developed for ChatGPT | GTC

The NVIDIA H100 NVL is a high-performance graphics processor designed specifically for demanding machine learning and artificial intelligence workloads, built to handle large amounts of data efficiently while performing complex calculations. Based on the NVIDIA Hopper architecture, the H100 NVL PCIe card is an impressive processing solution: it features dual-GPU NVLink connectivity, which enables fast communication between the two GPUs, and 94 GB of HBM3 memory per chip, allowing the card to process large amounts of data quickly.
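To illustrate the dual-GPU setup, here is a minimal CUDA sketch that queries the memory size of both GPUs and enables peer-to-peer access between them, which lets copies travel over the NVLink bridge. It simply assumes that the two GPUs of an H100 NVL pair show up as devices 0 and 1; the same code runs on any multi-GPU system.

```cpp
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    if (count < 2) { printf("need at least two GPUs\n"); return 1; }

    // Report name and memory size of the first two devices.
    for (int dev = 0; dev < 2; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("GPU %d: %s, %.0f GB memory\n",
               dev, prop.name, prop.totalGlobalMem / 1e9);
    }

    // Enable peer-to-peer access in both directions so GPU-to-GPU copies
    // can use the direct link (NVLink, if present).
    int canAccess = 0;
    cudaDeviceCanAccessPeer(&canAccess, 0, 1);
    if (canAccess) {
        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(1, 0);
        cudaSetDevice(1);
        cudaDeviceEnablePeerAccess(0, 0);
        printf("peer access enabled between GPU 0 and GPU 1\n");
    }
    return 0;
}
```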

Particularly noteworthy is the card's ability to handle models with up to 175 billion parameters, the size of the GPT-3 model behind ChatGPT, on the fly. This shows that the NVIDIA H100 NVL GPU is well suited for computationally intensive workloads such as artificial intelligence and machine learning.
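A quick back-of-the-envelope calculation shows why such a model fits: at FP8 precision each weight occupies one byte, so 175 billion parameters need roughly 175 GB, which is below the 188 GB that the two 94 GB GPUs of an NVL pair provide together. The sketch below deliberately ignores overhead such as activations and the KV cache.

```cpp
#include <cstdio>

int main() {
    const double params     = 175e9;      // GPT-3-class parameter count
    const double bytes_fp8  = 1.0;        // one byte per weight in FP8
    const double weights_gb = params * bytes_fp8 / 1e9;
    const double nvl_mem_gb = 2 * 94.0;   // two GPUs with 94 GB HBM3 each

    printf("FP8 weights     : %.0f GB\n", weights_gb);  // ~175 GB
    printf("H100 NVL memory : %.0f GB\n", nvl_mem_gb);  // 188 GB
    return 0;
}
```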

Compared to the H100 SXM5 configuration, the H100 PCIe configuration has lower specifications. While the full GH100 GPU contains 144 streaming multiprocessors (SMs), only 114 SMs are active in the H100 PCIe configuration; the H100 SXM5 configuration enables 132 SMs. Despite this reduction, the H100 PCIe configuration remains powerful and offers impressive computing capacity.
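For reference, the active SM count and peak clock can be read out at runtime, which is where the 114 (PCIe) versus 132 (SXM5) difference would show up. A small sketch:

```cpp
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int sms = 0, clockKHz = 0;
    // Number of active streaming multiprocessors and peak clock (in kHz).
    cudaDeviceGetAttribute(&sms, cudaDevAttrMultiProcessorCount, 0);
    cudaDeviceGetAttribute(&clockKHz, cudaDevAttrClockRate, 0);
    printf("active SMs: %d, peak clock: %.2f GHz\n", sms, clockKHz / 1e6);
    return 0;
}
```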

The H100 PCIe configuration delivers 3,200 TFLOPS of FP8 tensor performance, 1,600 TFLOPS of FP16 tensor performance, and 48 TFLOPS of double-precision (FP64) performance. In addition, the GPU is equipped with 456 tensor cores and 456 texture units that can be used to handle complex artificial intelligence and machine learning computations.
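The tensor throughput is reached through matrix operations on the Tensor Cores. A hedged sketch of how such an operation is typically issued in practice: a dense FP16 GEMM through cuBLAS with FP32 accumulation, which cuBLAS dispatches to the Tensor Cores on Hopper-class GPUs where possible (matrix size and initialization are arbitrary here).

```cpp
#include <cublas_v2.h>
#include <cuda_fp16.h>
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    const int n = 4096;                               // arbitrary square size
    const size_t bytes = size_t(n) * n * sizeof(__half);

    __half *a, *b, *c;
    cudaMalloc(&a, bytes);
    cudaMalloc(&b, bytes);
    cudaMalloc(&c, bytes);
    // (filling a and b with data is omitted in this sketch)

    cublasHandle_t handle;
    cublasCreate(&handle);

    const float alpha = 1.0f, beta = 0.0f;
    // FP16 inputs, FP32 accumulation: the typical Tensor Core GEMM setup.
    cublasGemmEx(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                 &alpha, a, CUDA_R_16F, n,
                         b, CUDA_R_16F, n,
                 &beta,  c, CUDA_R_16F, n,
                 CUBLAS_COMPUTE_32F, CUBLAS_GEMM_DEFAULT);
    cudaDeviceSynchronize();
    printf("GEMM done\n");

    cublasDestroy(handle);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Timing the call and dividing the 2·n³ floating-point operations by the elapsed time would give the achieved TFLOPS, which can then be compared against the peak figures quoted above.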

Although the number of SMs is lower than in the full GH100 GPU and the H100 SXM configuration, the H100 PCIe configuration is still capable of performing sophisticated computations in fields such as science, artificial intelligence, and machine learning. It is a good choice for workloads that do not need the full SM count or for systems that must rely on a PCIe form factor due to space constraints.

The NVIDIA H100 NVL GPU was designed specifically to meet the needs of ChatGPT, a chatbot based on a large language model developed by OpenAI. ChatGPT is designed to hold human-like conversations based on text input and requires an enormous amount of processing power to run efficiently and quickly. Using the NVIDIA H100 NVL GPU, ChatGPT can process large amounts of data in real time and respond faster, contributing to improved performance and a smoother user experience.

The H100 PCIe is a version of the NVIDIA H100 GPU designed as a PCIe card. Compared to the more powerful SXM5 version, the H100 PCIe has lower peak computing performance and therefore operates at lower clock frequencies. The card has a maximum thermal design power (TDP) of 350 watts, while the SXM5 version has double that at 700 watts. The standard H100 PCIe (unlike the 94 GB NVL variant described above) comes with an 80 GB memory capacity, connected via a 5120-bit bus interface.

The memory of the H100 PCIe is implemented as HBM2e, which provides a bandwidth of around 2 TB/s. HBM2e (High Bandwidth Memory 2e) is an enhanced version of HBM2 and offers higher data transfer rates and better energy efficiency. The H100 PCIe is particularly suitable for applications that require less computing power than the SXM5 version provides. It can nevertheless handle large data sets thanks to its extensive memory capacity and high memory bandwidth.
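The quoted bandwidth can be sanity-checked with a simple device-to-device copy timed via CUDA events; on an H100 PCIe the result should approach the roughly 2 TB/s mark (buffer size and repetition count below are arbitrary choices).

```cpp
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    const size_t bytes = size_t(1) << 30;   // 1 GiB per buffer
    const int reps = 20;

    void *src = nullptr, *dst = nullptr;
    cudaMalloc(&src, bytes);
    cudaMalloc(&dst, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    for (int i = 0; i < reps; ++i)
        cudaMemcpy(dst, src, bytes, cudaMemcpyDeviceToDevice);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    // Each copy reads and writes the buffer once, so 2 * bytes move per rep.
    double gbps = 2.0 * bytes * reps / (ms / 1e3) / 1e9;
    printf("effective bandwidth: %.0f GB/s\n", gbps);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(src);
    cudaFree(dst);
    return 0;
}
```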

The H100 PCIe can be used in servers and workstations specialized in artificial intelligence, machine learning, deep learning or HPC applications. Overall, the H100 PCIe is a powerful GPU variant that offers large memory capacity and high memory bandwidth, while having a lower TDP than the SXM5 version.

Source: NVIDIA

Comments

Derfnam

Urgestein

7,517 comments, 2,029 likes

Is that source reliable? This reads very critically and objectively; the doubts about the product are enormous, and that really comes across.

diemelbecker

Member

29 comments, 1 like

Cool, the 12V connector is on the rear :D
So they can manage it there after all.

cunhell

Urgestein

545 comments, 499 likes

Well, the way it looks to me, the cards fit into standard 2U servers. You have no room on top there, so the connectors have to go on the rear. Form follows function. Not like with PCs, where function follows form ;-)

Cunhell

Gregor Kacknoob

Urgestein

524 comments, 442 likes

Function follows RGB *cough*

anfreund

Member

30 comments, 4 likes

Looks like gremlins got into the numbers and units in the article. The Tera prefix seems to be missing in places; otherwise it would be a pretty slow card. And 5120 bits as memory bandwidth?

RawMangoJuli

Veteran

253 comments, 146 likes

The 5120 bits are correct ... 5x 1024 bits

Deridex

Urgestein

2,204 comments, 843 likes

@Derfnam
I think you forgot the sarcasm sign ;)

LurkingInShadows

Urgestein

1,345 comments, 549 likes

It's so huge he can't fit it into the post anymore ^^

Derfnam

Urgestein

7,517 comments, 2,029 likes

Too subtle? Plus: unfortunately nothing sticks to imaginary ad banners.

Gamer

Veteran

147 comments, 37 likes

And straight away the next fail from Nvidia. Will they never learn?

anfreund

Member

30 comments, 4 likes

Why? What's the problem with this card? A bit expensive, yes...

LurkingInShadows

Urgestein

1,345 comments, 549 likes

Gamer's problem is (once again) that it's not from AMD.


About the author

Samir Bashir
