According to a DigiTimes report citing industry insiders, NVIDIA is interested in evaluating SK Hynix’s HBM3E samples. If this information is accurate, the next generation of NVIDIA GPUs for artificial intelligence (AI) and high-performance computing (HPC) applications could use HBM3E memory instead of HBM3.

According to industry insiders quoted by Korean media outlets such as Money Today and Seoul Economic Daily, NVIDIA has requested samples of HBM3E memory from SK Hynix in order to evaluate how the new memory type affects GPU performance. SK Hynix’s upcoming HBM3E memory raises the data transfer rate from the current 6.40 GT/s to 8.0 GT/s, which increases the bandwidth per memory stack from 819.2 GB/s to roughly 1 TB/s. However, it is still unclear whether HBM3E will be compatible with existing HBM3 controllers and interfaces, as SK Hynix has not yet released any information about this aspect of the new technology. In any case, NVIDIA and other developers of AI and HPC GPUs will need to evaluate the technology.
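For reference, the bandwidth figures above follow directly from the transfer rate and the width of the memory interface. The minimal Python sketch below reproduces them, assuming the standard 1,024-bit interface per HBM3/HBM3E stack, a detail taken from the JEDEC HBM3 specification rather than from the report itself.

```python
# Quick sanity check of the per-stack bandwidth figures quoted above.
# Assumption: 1,024-bit interface per HBM3/HBM3E stack (JEDEC HBM3 spec),
# which is not stated explicitly in the report.

BUS_WIDTH_BITS = 1024  # bits per HBM3/HBM3E stack interface


def stack_bandwidth_gb_s(transfer_rate_gt_s: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Per-stack bandwidth in GB/s: transfers per second x bits per transfer / 8 bits per byte."""
    return transfer_rate_gt_s * bus_width_bits / 8


print(f"HBM3  @ 6.4 GT/s: {stack_bandwidth_gb_s(6.4):7.1f} GB/s")  # 819.2 GB/s
print(f"HBM3E @ 8.0 GT/s: {stack_bandwidth_gb_s(8.0):7.1f} GB/s")  # 1024.0 GB/s, i.e. ~1 TB/s
```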
SK Hynix reportedly plans to begin sampling its HBM3E memory in the second half of 2023 and to start mass production in late 2023 or 2024. The company intends to manufacture HBM3E on its 1b-nanometer process, which is also used for DDR5-6400 DRAM and LPDDR5T memory chips. It remains to be seen which NVIDIA compute GPUs will use HBM3E, but the company is likely to adopt the new memory type for its next generation of processors, which are expected to hit the market in 2024.
However, it is unclear whether that will be a revamped Hopper GH100 compute GPU or something completely new. SK Hynix is currently the dominant player in the HBM memory market, controlling more than 50% of it, and it is also the exclusive manufacturer of HBM3, at least in the early stages. Market research firm Yole Development forecasts significant growth in the HBM market due to its bandwidth advantage over other DRAM types, estimating that the market will nearly double from $705 million in 2023 to $1.324 billion by 2027.
Source: TomsHardware