
Codestral 22B locally

Codestral 22B is an open-weight LLM from the Codestral family with 22B parameters, aimed mainly at code generation and developer agents. Detected minimum hardware: an RTX 4060 Ti 16GB.

Technical facts
Parameters: 22B
Max context: 33k
Q4_K_M: 13.8 GB
Q5_K_M: 16.9 GB
Q8: 24.6 GB
FP16: 49.2 GB
Family: Codestral
Last sync: 2026-05-12

Available quantizations

Q4_K_M (13.8 GB)

Acceptable. Good compromise when VRAM is limited.

Q5_K_M (16.9 GB)

Good quality. Sweet spot for size and precision.

Q8 (24.6 GB)

Near-FP16 quality. Comfortable for production.

FP16 (49.2 GB)

Reference precision. Maximum quality, at twice the VRAM of Q8.
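The trade-off across these four options can be sketched as a small helper that picks the largest quantization fitting a given VRAM budget. The sizes are the on-disk figures listed above; the function name and the 2 GB headroom for KV-cache and runtime overhead are assumptions for illustration.

```python
# Weight sizes (GB) for Codestral 22B, as listed above.
QUANT_SIZES_GB = {
    "Q4_K_M": 13.8,
    "Q5_K_M": 16.9,
    "Q8": 24.6,
    "FP16": 49.2,
}

def best_quant(vram_gb, headroom_gb=2.0):
    """Largest quantization whose weights plus headroom fit in vram_gb.

    headroom_gb is a rough allowance for KV-cache and runtime overhead;
    returns None when even Q4_K_M does not fit.
    """
    fitting = [(size, name) for name, size in QUANT_SIZES_GB.items()
               if size + headroom_gb <= vram_gb]
    return max(fitting)[1] if fitting else None

print(best_quant(16))  # 16 GB card -> Q4_K_M
print(best_quant(24))  # 24 GB card -> Q5_K_M
```

On a 16 GB card such as the detected minimum (RTX 4060 Ti 16GB), only Q4_K_M fits with headroom to spare.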

Compatible GPUs

GPUs that can run Codestral 22B on a single card, ranked by VRAM headroom.

Recommended multi-GPU rigs

To run Codestral 22B at a higher-precision quantization or with a longer context, a multi-GPU rig gives more headroom.

Recommended rig

2× GTX 1070

A prebuilt setup: Ubuntu, vLLM, Open WebUI, and Codestral 22B already downloaded.


VRAM estimates: parameters × bits/8, plus a margin for KV-cache and runtime overhead. Real performance varies with inference engine, context length, and batch size.
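The formula above can be written out as a back-of-envelope calculator. The 25% margin is an assumption chosen to roughly match the Q4_K_M figure in the table; the actual margin implied by the table varies by quantization.

```python
def vram_estimate_gb(params_b, bits, margin=0.25):
    """Rough VRAM estimate: params (billions) * bits / 8, plus margin.

    margin is an assumed overhead fraction for KV-cache and runtime;
    real usage depends on engine, context length, and batch size.
    """
    weights_gb = params_b * bits / 8
    return round(weights_gb * (1 + margin), 1)

print(vram_estimate_gb(22, 4))              # ~13.8 GB, matches Q4_K_M
print(vram_estimate_gb(22, 16, margin=0.0)) # 44.0 GB of raw FP16 weights
```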