DeepSeek V3 & R1

Models · Local

Description: Chinese open-source MoE models with excellent reasoning performance

Website: https://www.deepseek.com

With V3 and R1, DeepSeek has released two impressive open-source models under the MIT license that compete with leading proprietary models.

DeepSeek-V3

General-purpose MoE model (671B total parameters, ~37B activated per token). Available in V3.1 (dual-mode thinking/non-thinking operation) and V3.2-Exp (DeepSeek Sparse Attention).

DeepSeek-R1

Specialized reasoning model trained with reinforcement learning to produce explicit chains of thought; see the API sketch below.
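
As a rough illustration, the sketch below queries R1 through the developer API using the OpenAI-compatible Python client. The base URL, the "deepseek-reasoner" model name, and the reasoning_content field are assumptions based on DeepSeek's public API documentation and should be verified against the official docs.

```python
# Hedged sketch: querying DeepSeek-R1 via the developer API, assuming the
# OpenAI-compatible endpoint and model name documented by DeepSeek.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",      # placeholder key
    base_url="https://api.deepseek.com",  # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-reasoner",  # assumed model name for R1
    messages=[{"role": "user", "content": "Prove that the sum of two even integers is even."}],
)

msg = response.choices[0].message
# The reasoner is documented to return its chain of thought separately from the answer.
print(getattr(msg, "reasoning_content", None))  # reasoning trace, if provided
print(msg.content)                              # final answer
```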

Availability

Model weights are released under the MIT license for self-hosting and are available on Hugging Face and GitHub. DeepSeek also offers a web interface, mobile apps, and a developer API.
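
For self-hosting, the full V3 and R1 checkpoints are far too large for a single consumer GPU, but the smaller distilled R1 checkpoints published under the deepseek-ai organization on Hugging Face can be loaded with transformers. A minimal sketch follows; the repo id is an assumption and should be checked against the organization page.

```python
# Hedged sketch: loading a distilled R1 checkpoint from Hugging Face with
# transformers; the repo id below is an assumption and may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "What is 17 * 24?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```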

Highlight

Performance comparable to leading proprietary models, with full openness.