NVIDIA B100 for local AI
The NVIDIA B100 provides 192 GB of VRAM for local AI. In the LocalIA catalog, 233 out of 242 models run comfortably on a single card.
VRAM: 192 GB
Category: Datacenter
Series: Blackwell DC
Vendor: NVIDIA
Models that run comfortably
233 models. These models fit in 192 GB with room for context and stable inference.
Unlocked in a 2x rig
384 GB. With two cards in parallel (384 GB total), larger models become reachable.
Unlocked in a 4x rig
768 GB. Server-style configuration (768 GB total) for the largest open-weight models.
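The single-card, 2x, and 4x thresholds above can be sketched as a rough fit check. The sizing formula below (weights at a given precision, plus KV-cache headroom, times an overhead factor) is an illustrative assumption, not LocalIA's actual fitting rule; the `est_vram_gb` and `smallest_rig` helpers are hypothetical names.

```python
# Rough VRAM-fit check against this card's single/2x/4x pooled totals.
# The sizing formula is an assumption for illustration, not LocalIA's rule.

CARD_VRAM_GB = 192  # per-card VRAM from the spec above

def est_vram_gb(params_b, bytes_per_param=2, kv_cache_gb=8, overhead=1.2):
    """Estimate inference footprint in GB: weights at a given precision
    (2 bytes/param = FP16/BF16), plus KV-cache headroom, times a
    runtime-overhead factor for activations and buffers."""
    return (params_b * bytes_per_param + kv_cache_gb) * overhead

def smallest_rig(params_b):
    """Return the fewest cards (1, 2, or 4) whose pooled VRAM holds the
    estimated footprint, or None if even a 4x rig (768 GB) is too small."""
    need = est_vram_gb(params_b)
    for cards in (1, 2, 4):
        if need <= cards * CARD_VRAM_GB:
            return cards
    return None

# A 70B model at FP16: (70*2 + 8) * 1.2 = 177.6 GB -> fits one 192 GB card.
# A 180B model: (180*2 + 8) * 1.2 = 441.6 GB -> needs the 4x rig (768 GB).
```

With these assumptions, quantizing to 4-bit (`bytes_per_param=0.5`) shrinks the footprint roughly fourfold, which is how much larger models "become reachable" on the same hardware.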
VRAM estimates updated 2026-05-12.