Phi-3 Mini 3.8B
by Microsoft · phi family
3.8B parameters
Tags: text-generation, code-generation, reasoning, math, summarization
Phi-3 Mini 3.8B is Microsoft's compact yet powerful language model trained on high-quality synthetic and curated data. It delivers performance that rivals models twice its size, especially in reasoning, math, and coding benchmarks. With support for up to 128K context and a permissive MIT license, Phi-3 Mini is well-suited for on-device and edge deployment. Its small footprint allows it to run on a wide range of consumer hardware while still providing useful outputs for many practical tasks.
Quick Start with Ollama

```
ollama run 3.8b-mini-instruct-4k-q8_0
```

Quantization Options
| Format | File Size | VRAM Required | Quality | Ollama Tag |
|---|---|---|---|---|
| Q4_K_M | 1.9 GB | 3.8 GB | ★★★★★ | 3.8b-mini-instruct-4k-q4_K_M |
| Q8_0 (recommended) | 3.4 GB | 5.8 GB | ★★★★★ | 3.8b-mini-instruct-4k-q8_0 |
| F16 | 7.2 GB | 9.6 GB | ★★★★★ | 3.8b-mini-instruct-4k-fp16 |
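The file sizes in the table follow a simple rule of thumb: weight size ≈ parameter count × bits per weight ÷ 8. A minimal sketch of that estimate (the bits-per-weight values are approximations, not from this page; real GGUF files mix tensor precisions, so actual sizes deviate somewhat from the estimate):

```python
# Back-of-envelope GGUF weight-size estimate: params * bits-per-weight / 8.
# Bits-per-weight values are approximate; actual GGUF files mix tensor
# precisions, so real file sizes differ slightly from this estimate.
APPROX_BITS_PER_WEIGHT = {"Q4_K_M": 4.5, "Q8_0": 8.5, "F16": 16.0}

def estimate_size_gb(params_billions: float, quant: str) -> float:
    """Estimated weight size in GB for a given quantization format."""
    return params_billions * APPROX_BITS_PER_WEIGHT[quant] / 8

for quant in APPROX_BITS_PER_WEIGHT:
    print(f"{quant}: ~{estimate_size_gb(3.8, quant):.1f} GB")
```

For a 3.8B-parameter model this lands in the same range as the table above; VRAM needs run higher than the file size because the KV cache and activations also occupy memory.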
Compatible Hardware for Q8_0
Showing compatibility for the recommended quantization (Q8_0, 5.8 GB VRAM).
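The per-format VRAM figures translate directly into a fit check against a given GPU. A minimal sketch, with the dictionary values taken from the Quantization Options table above:

```python
# VRAM requirements in GB, taken from the quantization table above.
VRAM_REQUIRED_GB = {"Q4_K_M": 3.8, "Q8_0": 5.8, "F16": 9.6}

def compatible_quants(gpu_vram_gb: float) -> list[str]:
    """Return the quantization formats that fit within the given VRAM."""
    return [q for q, need in VRAM_REQUIRED_GB.items() if need <= gpu_vram_gb]

print(compatible_quants(8.0))  # an 8 GB card fits Q4_K_M and Q8_0, not F16
```

By this check, the recommended Q8_0 build needs a card with at least 6 GB of VRAM, while Q4_K_M fits common 4 GB cards.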
Benchmark Scores

| Benchmark | Score |
|---|---|
| MMLU | 68.8 |
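Beyond `ollama run`, a pulled tag can be queried programmatically: Ollama serves a REST API on localhost port 11434, and `POST /api/generate` accepts a JSON body naming the model tag. A minimal sketch that only builds the request body (actually sending it assumes a running Ollama server, which is not shown here):

```python
import json

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str, num_ctx: int = 4096) -> str:
    """JSON body for Ollama's /api/generate endpoint (non-streaming)."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,                  # return one complete response
        "options": {"num_ctx": num_ctx},  # context window for this request
    })

body = build_request("3.8b-mini-instruct-4k-q8_0", "Explain quantization briefly.")
# Send with e.g. urllib.request.Request(OLLAMA_URL, data=body.encode(),
#                headers={"Content-Type": "application/json"}) and urlopen().
```

The 4k-context tags cap `num_ctx` at 4096; the 128K-context variants of Phi-3 Mini accept much larger values at a correspondingly higher memory cost.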