DeepSeek R1 1.5B
by DeepSeek · deepseek-r1 family · 1.5B parameters
Tags: text-generation, reasoning, math
DeepSeek R1 1.5B is a distilled reasoning model based on Qwen 2.5, bringing chain-of-thought reasoning to ultra-lightweight hardware. Despite its tiny size, it shows the characteristic "thinking" behavior of the R1 family on math and logic tasks. At under 3 GB VRAM for Q8, it runs on virtually any hardware — great for edge devices, experimenting with reasoning models, or as a fast local assistant.
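The "thinking" behavior mentioned above shows up in the raw output: R1-family models emit their chain of thought inside `<think>...</think>` tags before the final answer. A minimal sketch for separating the two (the helper name and the mocked response are ours, not part of the model's API):

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Split an R1-style response into (reasoning, answer).

    R1-family models wrap their chain of thought in <think>...</think>
    tags; everything after the closing tag is the final answer.
    """
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match is None:
        return "", text.strip()  # no visible reasoning block
    reasoning = match.group(1).strip()
    answer = text[match.end():].strip()
    return reasoning, answer

# Example with a mocked response (not actual model output):
raw = "<think>2 + 2: add the units digits.</think>The answer is 4."
reasoning, answer = split_reasoning(raw)
```

This is handy when you only want the final answer in a UI but still want to log the reasoning trace.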
Quick Start with Ollama
ollama run deepseek-r1:1.5b-q8_0

| Spec | Value |
|---|---|
| Creator | DeepSeek |
| Parameters | 1.5B |
| Architecture | transformer-decoder |
| Context | 128K tokens |
| Released | Jan 20, 2025 |
| License | MIT |
| Ollama | deepseek-r1:1.5b |
Quantization Options
| Format | File Size | VRAM Required | Ollama Tag |
|---|---|---|---|
| Q4_K_M | 1.1 GB | 2 GB | 1.5b-q4_K_M |
| Q8_0 (recommended) | 1.9 GB | 3 GB | 1.5b-q8_0 |
| F16 | 3.5 GB | 5 GB | 1.5b-fp16 |
Compatible Hardware
The recommended Q8_0 quantization needs only about 3 GB of VRAM, so the model fits on virtually any modern GPU; it also runs CPU-only via Ollama at reduced speed.
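The VRAM figures in the table above are roughly parameter count × bytes per weight, plus headroom for the KV cache and runtime buffers. A back-of-the-envelope sketch (the ~1 GB overhead constant is our assumption, not an official formula):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead_gb: float = 1.0) -> float:
    """Rough VRAM estimate: weight memory plus a fixed overhead.

    overhead_gb is a crude assumption covering KV cache and buffers;
    real usage depends on context length, batch size, and the runtime.
    """
    weight_gb = params_billion * bits_per_weight / 8  # 1e9 weights * bits -> GB
    return weight_gb + overhead_gb

# Q8_0 (~8 bits/weight) on a 1.5B model:
print(round(estimate_vram_gb(1.5, 8), 1))   # -> 2.5, in line with the 3 GB figure
# F16 (16 bits/weight):
print(round(estimate_vram_gb(1.5, 16), 1))  # -> 4.0
```

The estimates land slightly under the table's numbers because published requirements round up to whole GPU-memory tiers.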
Benchmark Scores

| Benchmark | Score |
|---|---|
| MMLU | 52.0 |