At Computex 2025 in Taipei, Nvidia CEO Jen-Hsun Huang was not to be outdone: at a joint event with Foxconn chairman Liu Yangwei, he stated unequivocally that Moore’s Law is over. Instead, he pointed to a mixture of 3D chip packaging, NVLink, liquid cooling and software architecture as the new drivers of the AI industry. And if he has his way, the pace of development in the industry will soon be limited only by the sky.
Moore’s Law: an end foretold
Huang’s diagnosis is sober: the days when the density of transistors on a chip doubled every two years are over. The physical limits have been reached – at least in the traditional sense. Rising production costs, sluggish energy efficiency gains and stagnating growth in clock frequency are forcing structural changes. The answer from Santa Clara is: new architectures, new systems, new ways of thinking.
Three technologies, one strategy
The first building block of Nvidia’s future model is 3D chip packaging. Instead of monolithic dies, several smaller chips are stacked vertically and combined into complex units. This reduces latency and increases computing power with comparatively moderate energy requirements – at least on paper. The second is the company’s own NVLink interconnect. This high-speed link networks individual chips so closely that, from a software perspective, they act as a single, large processor. That brings the “superchip” concept within reach without the disadvantages of huge dies. The third lever is complete integration at system level: mechanical components, liquid cooling, software and computing architecture are designed and built as a unit. The goal: a highly efficient, specialized AI server rack, optimized from the hardware down to the last software module.
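To make the “single large processor” claim a little more concrete: from the programmer’s side, tightly linked GPUs are exposed through peer-to-peer memory access. The following is a minimal sketch using the standard CUDA runtime API; the device indices and buffer size are assumptions chosen purely for illustration, and whether the traffic actually travels over NVLink or PCIe is transparent at this level – NVLink simply provides far more bandwidth for exactly these transfers.

// Minimal sketch: check and enable direct GPU-to-GPU access with the
// standard CUDA runtime API. Device indices 0 and 1 and the buffer size
// are illustrative assumptions, not figures from the keynote.
#include <cuda_runtime.h>
#include <stdio.h>

int main(void) {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    if (deviceCount < 2) {
        printf("This sketch needs at least two GPUs.\n");
        return 0;
    }

    int canAccess01 = 0, canAccess10 = 0;
    cudaDeviceCanAccessPeer(&canAccess01, 0, 1);  // can device 0 address device 1?
    cudaDeviceCanAccessPeer(&canAccess10, 1, 0);  // and the other direction
    printf("peer access 0->1: %d, 1->0: %d\n", canAccess01, canAccess10);

    if (canAccess01 && canAccess10) {
        // Enable peer access in both directions, then copy a buffer directly
        // from GPU 0 memory to GPU 1 memory without staging through the host.
        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(1, 0);
        cudaSetDevice(1);
        cudaDeviceEnablePeerAccess(0, 0);

        size_t bytes = 1 << 20;  // 1 MiB test buffer
        void *src = NULL, *dst = NULL;
        cudaSetDevice(0);
        cudaMalloc(&src, bytes);
        cudaSetDevice(1);
        cudaMalloc(&dst, bytes);

        cudaMemcpyPeer(dst, 1, src, 0, bytes);  // direct device-to-device copy
        cudaDeviceSynchronize();

        cudaFree(dst);
        cudaSetDevice(0);
        cudaFree(src);
        printf("peer-to-peer copy done\n");
    }
    return 0;
}

Seen this way, NVLink is less a new programming model than a faster substrate for communication patterns developers already use.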
AI architecture: from the GPU to the infrastructure
Huang emphasizes that Nvidia is now much more than just a graphics card manufacturer. In three decades, the company has transformed itself into a platform provider for complete AI infrastructures – including chips, software, data centers and operating systems. This transformation is no coincidence, but the result of a targeted reorientation: away from pure computing power and towards holistic system integration. A central element of this strategy is the optimization of information flows within neural networks. Instead of channeling data via external interfaces, as much communication as possible is now handled internally – directly on the chip or between closely networked components. This reduces latencies and saves energy.
Growth without limits?
Huang’s forecast regarding the speed of development is particularly remarkable: Nvidia’s performance is currently doubling every six months. In the near future, according to Huang, this cycle could shrink to three months. Classic Moore’s Law growth would then not merely fade into the background, but be replaced by a far more aggressive cadence. Of course, it remains to be seen whether such statements hold up in the long term. After all, even Nvidia is not free from physical constraints – for example in terms of thermal load, power supply and material costs. Nevertheless, one thing is certain: the company is working hard to prepare for a world in which AI infrastructure forms the central backbone of the digital economy.
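As a rough back-of-the-envelope comparison (our own arithmetic, not a figure from the keynote): if performance doubles every T months, the factor per year works out to

\[
2^{12/T}:\qquad T = 24 \Rightarrow \approx 1.4\times,\qquad T = 6 \Rightarrow 4\times,\qquad T = 3 \Rightarrow 16\times .
\]

A three-month cadence would therefore mean roughly sixteen times the performance within a single year, an order of magnitude beyond the classic two-year rhythm.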
Between sky and ground: planning with a sense of proportion
Despite all his visions, Huang does not lose touch with reality. Data centers cannot be built overnight – they need land, power connections, cooling capacity and a stable supply chain. Nvidia wants to address this with detailed infrastructure plans so that companies around the world can make targeted investments. Political and economic factors also play a role. The statements at Computex come in a tense geopolitical climate in which Taiwan, China and the USA are caught in an increasingly complex web of dependencies and tensions. It is probably no coincidence that Nvidia is outlining its vision of the future here of all places.
The end of a law, the beginning of a system
With the announcement that only the sky is the limit, Huang is finally saying goodbye to Moore’s paradigm – at least rhetorically. In its place comes a systems approach that treats computing power as the result of intelligent architecture, efficient integration and software optimization. Whether this strategy will work in the long term remains to be seen. But one thing is clear: Nvidia is positioning itself as a driver of the next development phase – not just with chips, but with an overall package that inextricably links hardware and software and which, according to Huang, is located somewhere between earth, cloud and cosmos.
Source: Money