CoreWeave Introduces NVIDIA GB200 NVL72 for AI at Scale

AI reasoning models and intelligent agents are transforming industries, but they demand enormous computing power and advanced infrastructure to run efficiently at scale. Because these models generate large volumes of additional tokens as they reason, they need high-bandwidth communication, memory, and compute resources to deliver fast, high-quality results.

CoreWeave Brings NVIDIA Blackwell to the Cloud

To meet this growing demand, CoreWeave has launched cloud instances powered by NVIDIA GB200 NVL72, making it the first cloud provider to make NVIDIA's latest Blackwell platform generally available.

Each instance is built on a rack of 72 NVIDIA Blackwell GPUs and 36 NVIDIA Grace CPUs connected by NVIDIA NVLink, with NVIDIA Quantum-2 InfiniBand networking that lets companies scale clusters to as many as 110,000 GPUs. This setup delivers the performance needed for advanced AI models and applications.

How does NVIDIA GB200 NVL72 work?

Liquid-cooled, rack-scale system: NVLink connects all 72 GPUs in the rack so they operate together as a single massive GPU.

Faster AI processing: The latest generation of NVLink delivers 130 TB/s of aggregate GPU bandwidth, and the Transformer Engine accelerates AI workloads while maintaining accuracy.

Optimized cloud services: CoreWeave Kubernetes Service handles workload orchestration, while Slurm on Kubernetes (SUNK) brings Slurm's job scheduling to Kubernetes clusters, distributing tasks efficiently across GPUs.

High-speed networking: NVIDIA Quantum-2 InfiniBand provides 400 Gb/s of bandwidth per GPU, supporting massive AI clusters.
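As a rough sanity check on the 130 TB/s figure above, the aggregate NVLink bandwidth of a rack follows from the per-GPU number. This sketch assumes NVIDIA's published figure of 1.8 TB/s of NVLink bandwidth per Blackwell GPU, which is not stated in this article:

```python
# Back-of-the-envelope check: aggregate NVLink bandwidth of an NVL72 rack.
# Assumes 1.8 TB/s of NVLink bandwidth per Blackwell GPU (an assumption,
# not a figure given in this article).
gpus_per_rack = 72
nvlink_tb_per_gpu = 1.8  # TB/s per GPU

aggregate_tb = gpus_per_rack * nvlink_tb_per_gpu
print(f"{aggregate_tb:.1f} TB/s")  # 129.6 TB/s, i.e. roughly the 130 TB/s cited
```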

A Complete AI Computing Platform

NVIDIA’s AI platform combines advanced hardware with powerful software to help businesses build smarter AI agents. Key software tools include:

NVIDIA Blueprints: Pre-built workflows to simplify AI application development.

NVIDIA NIM: Microservices for secure, scalable AI model deployment.

NVIDIA NeMo: Tools for training and customizing AI models.

All of these tools can be deployed on CoreWeave’s cloud infrastructure, making it easier for enterprises to build and scale AI-powered applications.
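To make this concrete, the sketch below shows what deploying one of these inference microservices on a Kubernetes cluster might look like. The image name, GPU count, and port are illustrative placeholders, not CoreWeave- or NVIDIA-documented values:

```yaml
# Minimal sketch of a Kubernetes Deployment for an AI inference
# microservice on GPU nodes. The image name and port are illustrative
# placeholders; nvidia.com/gpu is the standard GPU resource request.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nim-inference
spec:
  replicas: 1
  selector:
    matchLabels:
      app: nim-inference
  template:
    metadata:
      labels:
        app: nim-inference
    spec:
      containers:
        - name: nim
          image: nvcr.io/nim/example-model:latest  # placeholder image
          ports:
            - containerPort: 8000
          resources:
            limits:
              nvidia.com/gpu: 1  # request one GPU for the container
```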

AI at Cloud Scale

With the launch of NVIDIA GB200 NVL72 instances, businesses can now access cutting-edge AI computing power directly in the cloud. These instances are available in the US-WEST-01 region, and companies can start using them through CoreWeave Kubernetes Service.
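For teams that schedule training jobs through SUNK rather than raw Kubernetes manifests, a submission might look like the following Slurm batch script. This is a sketch only; the partition name, node count, and script name are hypothetical, not values from CoreWeave's documentation:

```shell
#!/bin/bash
# Minimal Slurm batch script of the kind SUNK (Slurm on Kubernetes)
# schedules onto GPU nodes. Partition name, node count, and the
# training script are illustrative assumptions.
#SBATCH --job-name=train-demo
#SBATCH --nodes=2
#SBATCH --gpus-per-node=8
#SBATCH --partition=gb200        # hypothetical partition name

srun python train.py
```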

For those looking to leverage next-generation AI, CoreWeave provides the scale and technology needed to power the future of AI reasoning models and intelligent agents.