NVIDIA H100 NVL for local AI
NVIDIA H100 NVL provides 94 GB of VRAM for local AI. In the LocalIA catalog, 227 out of 242 models run comfortably on a single card.
VRAM: 94 GB
Category: Datacenter
Series: Hopper
Vendor: NVIDIA
Models that run comfortably: 227. These models fit in 94 GB with room for context and stable inference.
Tight models: 1. This model barely fits; it can run, but context and speed will be limited.
Unlocked in a 2x rig: with two cards in parallel (188 GB total), larger models become reachable.
Unlocked in a 4x rig: a server-style configuration (376 GB total) for the largest open-weight models.
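The tier boundaries above come down to simple arithmetic: a model's weights at a given quantization, plus context and runtime overhead, compared against each rig's VRAM budget. Below is a minimal sketch of that calculation, assuming roughly 4-bit quantization (~0.55 bytes per parameter) and a 20% overhead factor; the factor values and function names are illustrative assumptions, not the catalog's actual sizing method.

```python
# Minimal sketch: estimate whether a model fits a given VRAM budget.
# The quantization and overhead factors below are assumptions for
# illustration, not LocalIA's actual methodology.

VRAM_BUDGETS_GB = {"1x H100 NVL": 94, "2x rig": 188, "4x rig": 376}

def estimated_vram_gb(params_billion: float,
                      bytes_per_param: float = 0.55,  # ~4-bit quantization (assumed)
                      overhead: float = 1.2) -> float:  # context + runtime (assumed)
    """Rough VRAM footprint: weights x quantization factor x overhead."""
    return params_billion * bytes_per_param * overhead

def classify(params_billion: float) -> str:
    """Return the smallest rig whose VRAM budget covers the estimated footprint."""
    need = estimated_vram_gb(params_billion)
    for rig, budget in VRAM_BUDGETS_GB.items():
        if need <= budget:
            return f"{params_billion}B model needs ~{need:.0f} GB -> fits on {rig}"
    return f"{params_billion}B model needs ~{need:.0f} GB -> exceeds a 4x rig"

if __name__ == "__main__":
    for size in (70, 180, 405):
        print(classify(size))
```

With these assumed factors, a 70B model lands comfortably on a single card, a ~180B model needs the 2x rig, and a ~400B model needs the 4x rig; heavier quantization or shorter context shifts models down a tier.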
VRAM estimates updated 2026-05-12.