Instinct MI300X for local AI
Instinct MI300X provides 192 GB of VRAM for local AI. In the LocalIA catalog, 233 out of 242 models run comfortably on a single card.
VRAM
192 GB
Category
Datacenter
Series
Instinct CDNA 3+
Vendor
AMD
Models that run comfortably
233 models
These models fit in 192 GB with room for context and stable inference.
Unlocked in a 2x rig
384 GB
With two cards in parallel (384 GB total), larger models become reachable.
Unlocked in a 4x rig
768 GB
A server-style configuration (768 GB total) for the largest open-weight models.
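Whether a model "fits" in a given VRAM budget comes down to simple arithmetic: weight size (parameter count times bytes per parameter) plus headroom for the KV cache and runtime overhead. The sketch below is a rough rule of thumb, not the estimator the LocalIA catalog actually uses; the 20% overhead fraction and the example model sizes are illustrative assumptions.

```python
def estimate_vram_gb(params_billions: float,
                     bits_per_weight: int = 16,
                     overhead_frac: float = 0.2) -> float:
    """Rough VRAM estimate for inference.

    params_billions: model size in billions of parameters.
    bits_per_weight: 16 for fp16/bf16, 8 or 4 for quantized weights.
    overhead_frac:   assumed headroom for KV cache, activations,
                     and runtime buffers (illustrative 20%).
    """
    weights_gb = params_billions * (bits_per_weight / 8)
    return weights_gb * (1 + overhead_frac)


# A hypothetical 70B model in fp16 fits comfortably on one 192 GB card:
print(estimate_vram_gb(70))                      # ~168 GB, under 192 GB

# A hypothetical 405B model quantized to 4 bits exceeds a single card
# but fits in a 2x rig (384 GB total):
print(estimate_vram_gb(405, bits_per_weight=4))  # ~243 GB
```

By this rule of thumb, dropping from 16-bit to 4-bit weights cuts the footprint by roughly 4x, which is why quantization moves many models from the multi-card tiers down to a single card.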
VRAM estimates updated 2026-05-12.