Compatibility Checker

Select your hardware and a model to see whether that model can run locally with Ollama.

The checker reports one of four compatibility tiers; a sketch of how these tiers might be computed follows the list.

Runs: The model fits comfortably, with at least 15% headroom left for context and the OS.

Runs (tight): The model fits, but with a limited context window; you may need to close other apps.

CPU Offload: Some layers are offloaded to CPU/RAM. Slower, but functional.

No: The model is too large even with CPU offloading.
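As a rough illustration of the logic behind these tiers, the sketch below classifies a model against a device's memory. The 15% headroom figure comes from the legend above; the function names, the hardware fields, and the CPU-offload cutoff (model fits in VRAM plus system RAM) are assumptions for illustration, not Ollama's actual sizing algorithm.

```typescript
// Compatibility tiers as described in the legend above.
type Compatibility = "Runs" | "Runs (tight)" | "CPU Offload" | "No";

// Hypothetical hardware description: dedicated GPU memory plus the
// system RAM assumed available for offloaded layers.
interface Hardware {
  vramGB: number;
  ramGB: number;
}

function checkCompatibility(modelSizeGB: number, hw: Hardware): Compatibility {
  const headroom = 0.15; // 15%+ headroom reserved for context and the OS

  // Fits with headroom to spare.
  if (modelSizeGB <= hw.vramGB * (1 - headroom)) {
    return "Runs";
  }
  // Fits, but leaves little room for the context window.
  if (modelSizeGB <= hw.vramGB) {
    return "Runs (tight)";
  }
  // Assumed cutoff: offloading works as long as the model fits in VRAM + RAM.
  if (modelSizeGB <= hw.vramGB + hw.ramGB) {
    return "CPU Offload";
  }
  return "No";
}

// Example: a 14 GB quantized model on a 16 GB GPU with 32 GB of system RAM.
console.log(checkCompatibility(14, { vramGB: 16, ramGB: 32 })); // "Runs (tight)"
```

In practice a checker would also account for the chosen quantization, the requested context length, and per-layer sizes when deciding how much can be offloaded; this sketch only shows the tier boundaries themselves.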