NVIDIA H100 80GB for local AI
The NVIDIA H100 80GB provides 80 GB of VRAM for local AI. Of the 242 models in the LocalIA catalog, 224 run comfortably on a single card.
VRAM: 80 GB
Category: Datacenter
Series: Hopper
Vendor: NVIDIA
Models that run comfortably: 224 models
These models fit in 80 GB with room for context and stable inference (a rough fit-estimate sketch follows below).
Tight models: 3 models
These models barely fit. They can run, but context and speed will be limited.
Unlocked in a 2x rig: 160 GB
With two cards in parallel (160 GB total), larger models become reachable.
Unlocked in a 4x rig: 320 GB
Server-style configuration (320 GB total) for the largest open-weight models.
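As a rough illustration of how fit categories like these can be derived: a model's VRAM need is approximately its weight footprint (parameter count × bytes per parameter, which depends on quantization) plus a KV-cache budget for context, plus runtime overhead. The Python sketch below runs this arithmetic against single-card (80 GB), 2x (160 GB), and 4x (320 GB) budgets. The thresholds, overhead factor, and the `estimate_vram_gb`/`classify_fit` helpers are illustrative assumptions, not LocalIA's actual methodology.

```python
# Rough VRAM fit estimator. All numbers here (bytes per parameter,
# KV-cache budget, overhead factor, 90% "comfortable" threshold) are
# illustrative assumptions, not LocalIA's actual methodology.
# GB vs GiB distinctions are ignored for simplicity.

def estimate_vram_gb(params_billion: float,
                     bytes_per_param: float = 2.0,  # assume FP16/BF16 weights
                     kv_cache_gb: float = 4.0,      # assumed context budget
                     overhead: float = 1.10) -> float:
    """Estimate VRAM need in GB: weights + KV cache + ~10% runtime overhead."""
    weights_gb = params_billion * bytes_per_param   # 1B params at 2 B/param = 2 GB
    return (weights_gb + kv_cache_gb) * overhead

def classify_fit(params_billion: float, total_vram_gb: float) -> str:
    """Classify a model against a card or rig VRAM budget."""
    need_gb = estimate_vram_gb(params_billion)
    if need_gb <= 0.90 * total_vram_gb:   # headroom left over: comfortable
        return "comfortable"
    if need_gb <= total_vram_gb:          # fits, but little headroom: tight
        return "tight"
    return "does not fit"

if __name__ == "__main__":
    for rig_gb in (80, 160, 320):         # 1x, 2x, 4x H100 80GB
        for size_b in (8, 34, 70, 120):   # hypothetical model sizes in B params
            print(f"{size_b:>4}B on {rig_gb:>3} GB: {classify_fit(size_b, rig_gb)}")
```

Under these assumptions, a 34B FP16 model lands in the tight zone on a single card and a 70B model stays tight even at 2x; quantizing to 4-bit (bytes_per_param=0.5) changes the picture substantially.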
VRAM estimates updated 2026-05-12.