# Available GPUs

Private Cloud clusters are available with the latest NVIDIA data center GPUs, optimized for AI training and inference at scale.

## GPU Options

### NVIDIA H100 SXM

The industry standard for large-scale AI training.

| Spec | Value |
|---|---|
| GPU Memory | 80 GB HBM3 |
| Memory Bandwidth | 3.35 TB/s |
| FP16 Performance | 989 TFLOPS |
| Interconnect | NVLink 4.0 (900 GB/s) |
| Best For | LLM training, fine-tuning, high-throughput inference |
### NVIDIA H200 SXM

An enhanced H100 with more memory and higher bandwidth for memory-bound workloads.

| Spec | Value |
|---|---|
| GPU Memory | 141 GB HBM3e |
| Memory Bandwidth | 4.8 TB/s |
| FP16 Performance | 989 TFLOPS |
| Interconnect | NVLink 4.0 (900 GB/s) |
| Best For | Large model inference, long-context training, models that need more VRAM |
### NVIDIA B200

Next-generation Blackwell architecture with more than double the H100's FP16 compute.

| Spec | Value |
|---|---|
| GPU Memory | 192 GB HBM3e |
| Memory Bandwidth | 8 TB/s |
| FP16 Performance | 2,250 TFLOPS |
| FP4 Performance | 9,000 TFLOPS |
| Interconnect | NVLink 5.0 (1,800 GB/s) |
| Best For | Frontier model training, next-gen inference, FP4 quantized workloads |
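To see why FP4 support matters, a rough back-of-envelope sketch of weight storage at different precisions (the 70B parameter count is an arbitrary example, not a specific model, and real footprints also include activations and KV cache):

```python
def weight_gb(params: float, bits: int) -> float:
    """Storage for model weights alone at a given precision, in GB."""
    return params * bits / 8 / 1e9

params = 70e9  # hypothetical 70B-parameter model

fp16 = weight_gb(params, 16)  # 140.0 GB -- exceeds a single H100's 80 GB
fp4 = weight_gb(params, 4)    # 35.0 GB -- fits easily on one B200
print(fp16, fp4)
```

FP4 cuts weight memory 4x relative to FP16, which is why FP4-quantized inference is called out as a Blackwell sweet spot.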
### NVIDIA B300

The latest Blackwell Ultra, with the highest memory capacity and bandwidth in the lineup.

| Spec | Value |
|---|---|
| GPU Memory | 288 GB HBM3e |
| Memory Bandwidth | 12 TB/s |
| FP16 Performance | 2,250 TFLOPS |
| FP4 Performance | 9,000 TFLOPS |
| Interconnect | NVLink 5.0 (1,800 GB/s) |
| Best For | Largest-scale training, trillion-parameter models, maximum memory capacity |
## Choosing a GPU
| GPU | Memory | Best For | Availability |
|---|---|---|---|
| H100 | 80 GB | General AI training, proven and widely supported | High |
| H200 | 141 GB | Memory-hungry models, large batch inference | Moderate |
| B200 | 192 GB | Next-gen training, 2x compute over H100 | Growing |
| B300 | 288 GB | Maximum scale, highest memory capacity | Limited |
## Networking

All Private Cloud clusters include high-speed interconnect:

| GPU | Intra-Node (NVLink) | Inter-Node (InfiniBand) |
|---|---|---|
| H100 | 900 GB/s | 400 Gb/s NDR |
| H200 | 900 GB/s | 400 Gb/s NDR |
| B200 | 1,800 GB/s | 400 Gb/s NDR |
| B300 | 1,800 GB/s | 400 Gb/s NDR |
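Note the mixed units in the table: NVLink figures are gigabytes per second (GB/s) while InfiniBand NDR is quoted in gigabits per second (Gb/s). Converting makes the intra- versus inter-node gap explicit (a per-link sketch; achievable bandwidth also depends on topology and the number of NICs per node):

```python
def gbit_to_gbyte(gbits_per_s: float) -> float:
    """Convert link speed from Gb/s to GB/s (8 bits per byte)."""
    return gbits_per_s / 8

ib = gbit_to_gbyte(400)    # 400 Gb/s NDR -> 50.0 GB/s per link
h100_ratio = 900 / ib      # NVLink 4.0 is 18x one IB link
b200_ratio = 1800 / ib     # NVLink 5.0 is 36x one IB link
print(ib, h100_ratio, b200_ratio)
```

This is why communication-heavy parallelism (e.g. tensor parallelism) is usually kept within a node, while inter-node traffic is reserved for less bandwidth-sensitive patterns.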