StarCoder2 15B
by BigCode · starcoder family
15B parameters · text-generation · code-generation
StarCoder2 15B is the largest model from the BigCode project, trained on The Stack v2 dataset, which covers more than 600 programming languages. It is among the most capable openly licensed code-generation models, excelling at code completion, infilling, and generation across a wide variety of languages. The model supports a 16K-token context window and was trained with a fill-in-the-middle (FIM) objective, making it particularly effective for IDE-style code completion. Its permissive license and strong benchmark performance make it a solid choice for local coding assistance.
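Because of the FIM training objective, the model can complete code between an existing prefix and suffix rather than only continuing from the left. The sketch below is a minimal illustration assuming the standard StarCoder FIM special tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`) and the `bigcode/starcoder2-15b` checkpoint on Hugging Face; verify the exact tokens against the model's tokenizer before depending on this format.

```python
# Sketch: fill-in-the-middle (FIM) completion with StarCoder2 15B via transformers.
# Assumes the standard StarCoder FIM special tokens; check the tokenizer config.
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "bigcode/starcoder2-15b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

# Code before and after the cursor; the model generates the missing middle.
prefix = "def fibonacci(n):\n    "
suffix = "\n    return a\n"
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(inputs.input_ids, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True))
```

IDE plugins typically build this prompt automatically from the text surrounding the cursor; the same format works when serving the model locally.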
Quick Start with Ollama
```bash
ollama run starcoder2:15b-q8_0
```

| Spec | Value |
|---|---|
| Creator | BigCode |
| Parameters | 15B |
| Architecture | transformer-decoder |
| Context Length | 16K tokens |
| License | BigCode Open RAIL-M v1 |
| Released | Feb 28, 2024 |
| Ollama | starcoder2 |
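Once the model has been pulled, it can also be queried programmatically. The sketch below assumes Ollama's default local REST endpoint (`http://localhost:11434/api/generate`) and the `15b-q8_0` tag; adjust the tag to match the quantization you actually pulled.

```python
# Sketch: querying a locally running Ollama instance for code completion.
# Assumes Ollama is serving on its default port (11434) and that
# starcoder2:15b-q8_0 has already been pulled.
import json
import urllib.request

payload = {
    "model": "starcoder2:15b-q8_0",
    "prompt": "# Python function that checks whether a number is prime\ndef is_prime(n):",
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```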
Quantization Options
| Format | File Size | VRAM Required | Quality | Ollama Tag |
|---|---|---|---|---|
| Q4_K_M | 7.6 GB | 10.5 GB | ★★★★★ | 15b-q4_K_M |
| Q5_K_M | 8.8 GB | 12 GB | ★★★★★ | 15b-q5_K_M |
| Q8_0 (recommended) | 13.5 GB | 17 GB | ★★★★★ | 15b-q8_0 |
Compatible Hardware for Q8_0
Showing compatibility for the recommended quantization (Q8_0, 17 GB VRAM).
5 hardware device(s) cannot run this model configuration.
Benchmark Scores
| Benchmark | Score |
|---|---|
| MMLU | 54.0 |