Mistral 3
Description: Open-source model family from French AI company Mistral AI, headlined by a large MoE flagship, with multimodal capabilities
Website: https://mistral.ai
Mistral AI released the Mistral 3 family in December 2025: a collection of open-source models licensed under Apache 2.0.
Model Variants
Ministral 3 Series:
- Sizes: 3B, 8B, 14B parameters
- Variants: Base, Instruct, Reasoning
- Highlight: Native image understanding; optimized for edge devices
Mistral Large 3:
- Parameters: 675B total (MoE), 41B active
- Ranking: #2 on LMSYS Arena for open-source non-reasoning models
- Variants: Base, Instruct (Reasoning to follow)
Features
- Languages: Over 40 languages
- Multimodal: Text and images
- Math: The 14B reasoning model achieves 85% on AIME 2025
- Deployment: Optimized for vLLM and TensorRT-LLM (see the sketch after this list)
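A minimal vLLM sketch of offline inference, assuming a hypothetical Hugging Face repo name (`mistralai/Ministral-3-8B-Instruct`) and illustrative sampling settings; substitute the model ID Mistral AI actually publishes:

```python
# Minimal offline-inference sketch with vLLM.
from vllm import LLM, SamplingParams

# NOTE: the model ID is an assumption for illustration only;
# replace it with the actual published repo name.
llm = LLM(model="mistralai/Ministral-3-8B-Instruct")
params = SamplingParams(temperature=0.7, max_tokens=256)

outputs = llm.generate(
    ["Summarize the Apache 2.0 license in one sentence."], params
)
for out in outputs:
    print(out.outputs[0].text)
```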
Availability
Available on Hugging Face, AWS Bedrock, Azure, and NVIDIA NIM. The smaller Ministral variants enable on-device AI for IoT and mobile apps.
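As a sketch of the Hugging Face route, the snippet below loads one of the smaller instruct checkpoints with the standard transformers text-generation flow; the repo name and dtype choice are assumptions, not confirmed details for these models:

```python
# Hedged sketch: loading a small checkpoint from Hugging Face with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Ministral-3-3B-Instruct"  # hypothetical repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # illustrative precision choice
    device_map="auto",
)

inputs = tokenizer("Bonjour ! Translate to English:", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```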
Application
Suited to low-latency, high-throughput scenarios. The Apache 2.0 license allows full customization, including fine-tuning and commercial use.
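For low-latency, high-throughput serving, one common pattern (a sketch, not official Mistral guidance) is vLLM's OpenAI-compatible server queried with the standard OpenAI client; the model ID and port here are assumptions:

```python
# Hedged sketch: querying a locally served model through vLLM's
# OpenAI-compatible API. Start the server first, e.g.:
#   vllm serve mistralai/Ministral-3-8B-Instruct   # hypothetical repo name
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # vLLM's default port
resp = client.chat.completions.create(
    model="mistralai/Ministral-3-8B-Instruct",  # must match the served model
    messages=[{"role": "user", "content": "Give one low-latency serving tip."}],
    max_tokens=128,
)
print(resp.choices[0].message.content)
```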