RTX Pro 6000 Blackwell for local AI
RTX Pro 6000 Blackwell provides 96 GB of VRAM for local AI. In the LocalIA catalog, 227 out of 242 models run comfortably on a single card.
VRAM: 96 GB
Category: Workstation
Series: RTX Pro Blackwell
Vendor: NVIDIA
Models that run comfortably: 227. These models fit in 96 GB with room for context and stable inference.
Tight models: 1. This model barely fits; it can run, but context and speed will be limited.
Unlocked in a 2x rig: 192 GB. With two cards in parallel (192 GB total), larger models become reachable.
Unlocked in a 4x rig: 384 GB. Server-style configuration (384 GB total) for the largest open-weight models.
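The fit categories above can be sketched with a rough back-of-the-envelope check. This is a minimal illustration, not the catalog's actual sizing method: it assumes weight memory is roughly parameter count times bytes per parameter, plus a flat overhead for KV cache and activations (real usage varies with runtime, context length, and quantization format; the `fits_in_vram` helper and the 8 GB overhead figure are assumptions for illustration).

```python
def fits_in_vram(params_b: float, bytes_per_param: float,
                 vram_gb: float, overhead_gb: float = 8.0) -> bool:
    """Rough check: do the model weights plus a flat overhead fit in VRAM?

    params_b        -- parameter count in billions
    bytes_per_param -- e.g. ~2.0 for FP16, ~0.5 for 4-bit quantization
    overhead_gb     -- assumed flat budget for KV cache and activations
    """
    weights_gb = params_b * bytes_per_param  # billions of params x bytes each ~= GB
    return weights_gb + overhead_gb <= vram_gb

# A 70B model at ~4-bit (~0.5 bytes/param) on one 96 GB card:
print(fits_in_vram(70, 0.5, 96))   # ~35 GB weights + 8 GB overhead -> True
# The same model at FP16 needs ~140 GB of weights alone:
print(fits_in_vram(70, 2.0, 96))   # False on a single card
print(fits_in_vram(70, 2.0, 192))  # True with two cards (192 GB total)
```

The same arithmetic explains the 2x and 4x rig tiers: pooling 192 GB or 384 GB raises the weight budget, while the per-model overhead still has to be accounted for.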
VRAM estimates updated 2026-05-12.