Mistral Model Family

🌬️ The Mistral Ecosystem

European AI Sovereignty

Mistral AI positions itself as the European leader in open-weight AI, with nearly all models released under the permissive Apache 2.0 license.

The Full Lineup (April 2026)

Model               | Params            | Architecture | Specialty
Mistral Large 3     | 675B (41B active) | Sparse MoE   | Flagship general-purpose, 256K context
Codestral 2         | 22B               | Dense        | Code generation & agentic coding
Devstral 2          | n/a               | Dense        | Frontier agentic dev workflows
Pixtral Large       | n/a               | VLM          | Vision-language, multimodal
Mistral Small 4     | ~14B              | Hybrid       | Unified instruct+reasoning+coding
Ministral 3B/8B/14B | 3-14B             | Dense        | Edge devices, cost-efficient
Magistral Small     | 24B               | Dense        | Reasoning-focused (open Apache 2.0)
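
To get a feel for what these parameter counts mean for serving, here is a rough back-of-the-envelope sketch. It counts weights-only memory (the helper name `approx_weight_gib` and the 4-bit quantization figure are illustrative assumptions); real deployments also need room for the KV cache and runtime overhead:

```python
def approx_weight_gib(params_billions: float, bits_per_weight: float) -> float:
    """Rough weights-only memory footprint in GiB."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30

# Mistral Large 3: all 675B weights must be resident for a sparse MoE,
# even though only ~41B parameters are active per token.
full = approx_weight_gib(675, 4)    # ~314 GiB at 4-bit
active = approx_weight_gib(41, 4)   # ~19 GiB worth of compute-active weights
print(f"675B @ 4-bit: ~{full:.0f} GiB; 41B active: ~{active:.0f} GiB")
```

This is why MoE models trade memory for speed: per-token compute scales with the active parameters, but serving memory scales with the total.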
💡 Key Advantage: Unlike Meta's Llama license, which imposes extra terms on companies above 700 million monthly active users, Apache 2.0 places no user-count restrictions on Mistral's models: commercial use is fully unrestricted for companies of any size.

Running Mistral Models

# Via Ollama
ollama run mistral-large

# Via llama.cpp (GGUF)
llama-server -m mistral-large-3-Q4_K_M.gguf --ctx-size 32768

# Via vLLM (production)
vllm serve mistralai/Mistral-Large-3 --tensor-parallel-size 4
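
Both vLLM and llama.cpp's llama-server expose an OpenAI-compatible /v1/chat/completions endpoint, so one client works against either backend. A minimal stdlib-only sketch (the base URL, port, and model name are placeholders matching the commands above; port 8000 is vLLM's default, llama-server defaults to 8080):

```python
import json
import urllib.request

def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion POST request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# Sending it requires one of the servers above to be running:
# with urllib.request.urlopen(chat_request(
#         "http://localhost:8000", "mistralai/Mistral-Large-3",
#         "Summarize Apache 2.0 in one line.")) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```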
KNOWLEDGE CHECK
Question 1 of 2
What license does Mistral Large 3 use?
- Llama Community License
- Research Only
- Apache 2.0
- GPL v3
Mistral Model Family | The Mistral Ecosystem — Open Source AI Academy