NVIDIA H200 141GB for local AI
NVIDIA H200 141GB provides 141 GB of VRAM for local AI. In the LocalIA catalog, 229 out of 242 models run comfortably on a single card.
VRAM: 141 GB
Category: Datacenter
Series: Hopper
Vendor: NVIDIA
Models that run comfortably
229 models. These models fit in 141 GB with room for context and stable inference.
Unlocked in a 2x rig
282 GB. With two cards in parallel (282 GB total), larger models become reachable.
Unlocked in a 4x rig
564 GB. A server-style configuration (564 GB total) for the largest open-weight models.
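The fit counts above follow from simple arithmetic: model weights plus context and activation overhead must fit in the pooled VRAM of the rig. A minimal sketch of that check, with an assumed 20% headroom for KV cache and activations (the function, its parameters, and the headroom value are illustrative assumptions, not LocalIA's actual rule):

```python
def fits_in_vram(params_b: float, bytes_per_param: float, num_cards: int = 1,
                 vram_per_card_gb: float = 141.0, headroom: float = 0.8) -> bool:
    """Rough fit check: weights must fit within a fraction of total VRAM,
    leaving the rest for KV cache and activations.

    All thresholds here are illustrative assumptions.
    """
    weights_gb = params_b * bytes_per_param  # 1B params ~ 1 GB per byte/param
    total_vram_gb = vram_per_card_gb * num_cards
    return weights_gb <= total_vram_gb * headroom

# A 70B model at FP16 (2 bytes/param) needs ~140 GB of weights:
fits_in_vram(70, 2.0, num_cards=1)  # False: too tight on a single 141 GB card
fits_in_vram(70, 2.0, num_cards=2)  # True: 282 GB pooled across a 2x rig
fits_in_vram(70, 0.5, num_cards=1)  # True: ~35 GB after 4-bit quantization
```

Quantization shifts the boundary: dropping from FP16 to 4-bit cuts the weight footprint by 4x, which is why many large models that need a 2x rig at full precision fit comfortably on one card.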
VRAM estimates updated 2026-05-12.