200+ EFLOPS: NVIDIA Grace Hopper superchip will form the basis of more than 40 AI supercomputers

News from the world of neural networks and AI
NVIDIA has announced that its GH200 Grace Hopper superchip will form the basis of more than 40 AI supercomputers around the world, deployed in research centers, on cloud platforms and elsewhere. Dozens of new GH200-based HPC systems are expected to come online soon. The superchip is aimed at the most demanding AI-driven scientific problems, which require processing terabytes of data.

Collectively, the GH200-based computing systems are expected to deliver approximately 200 EFLOPS of AI performance. HPE, for its part, announced that it will integrate the GH200 into HPE Cray supercomputers: EX254n nodes carry two Quad GH200 modules with four superchips each and can scale to tens of thousands of nodes. A similar approach is used in the Eviden BullSequana XH3000 platform, which the Jülich Research Center (FZJ) in Germany will receive as part of Jupiter, Europe's first exascale supercomputer.
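As a rough sanity check on the headline figure, the sketch below estimates how many superchips roughly 200 EFLOPS would imply. Both inputs are assumptions not stated in the article: about 4 PFLOPS of FP8 AI throughput per GH200 (a commonly cited peak for the Hopper GPU with sparsity, which may not be how NVIDIA counts the aggregate here) and four superchips per Quad GH200 module.

```python
# Back-of-envelope sketch, under stated assumptions, of how many GH200
# superchips and Quad GH200 modules would account for ~200 EFLOPS of
# aggregate AI performance.

AGGREGATE_EFLOPS = 200   # aggregate AI performance cited in the article
PER_CHIP_PFLOPS = 4      # assumed FP8 (sparse) peak per GH200, not from the article
CHIPS_PER_MODULE = 4     # one Quad GH200 module = four superchips

total_pflops = AGGREGATE_EFLOPS * 1_000        # 1 EFLOPS = 1,000 PFLOPS
chips = total_pflops / PER_CHIP_PFLOPS         # implied superchip count
modules = chips / CHIPS_PER_MODULE             # implied Quad GH200 module count

print(f"~{chips:,.0f} GH200 superchips across all systems")
print(f"~{modules:,.0f} Quad GH200 modules")
# Prints roughly 50,000 superchips and 12,500 quad modules under these assumptions.
```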

Japan's Joint Center for Advanced High Performance Computing (JCAHPC) intends to use the superchip in its next-generation supercomputer. The Texas Advanced Computing Center (TACC) at the University of Texas at Austin (USA) is equipping its Vista HPC system with the superchips. The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign will use GH200 solutions in its DeltaAI platform. In the United Kingdom, the Isambard-AI supercomputer, also built around the superchip, will be hosted at the University of Bristol.

All of these systems join the previously announced GH200-based platforms from the Swiss National Supercomputing Centre (CSCS) and SoftBank Corp. The GH200 is already available from select cloud providers such as Lambda and Vultr, and CoreWeave plans to launch GH200 instances in Q1 2024. Other system manufacturers, including ASRock Rack, ASUS, Gigabyte and Ingrasys, will begin shipping servers with the superchips by the end of the year.