Model Explorer
Browse and filter 118 models from 9 providers
| Model | Model ID | Tier | SWE Score | Context | Provider |
|---|---|---|---|---|---|
| DeepSeek V3.2 | deepseek-ai/deepseek-v3.2 | S+ | 73.1% | 128k | NIM |
| Kimi K2.5 | moonshotai/kimi-k2.5 | S+ | 76.8% | 128k | NIM |
| GLM 5 | z-ai/glm5 | S+ | 77.8% | 128k | NIM |
| GLM 4.7 | z-ai/glm4.7 | S+ | 73.8% | 200k | NIM |
| Kimi K2 Thinking | moonshotai/kimi-k2-thinking | S+ | 71.3% | 256k | NIM |
| MiniMax M2.1 | minimaxai/minimax-m2.1 | S+ | 74.0% | 200k | NIM |
| Step 3.5 Flash | stepfun-ai/step-3.5-flash | S+ | 74.4% | 256k | NIM |
| Qwen3 Coder 480B | qwen/qwen3-coder-480b-a35b-instruct | S+ | 70.6% | 256k | NIM |
| Qwen3 235B | qwen/qwen3-235b-a22b | S+ | 70.0% | 128k | NIM |
| Devstral 2 123B | mistralai/devstral-2-123b-instruct-2512 | S+ | 72.2% | 256k | NIM |
| Qwen3 235B | qwen-3-235b-a22b-instruct-2507 | S+ | 70.0% | 128k | Cerebras |
| Qwen3 235B | Qwen3-235B | S+ | 70.0% | 128k | SambaNova |
| DeepSeek V3.2 | DeepSeek-V3.2 | S+ | 68.0% | 128k | SambaNova |
| Qwen3 Coder | qwen/qwen3-coder:free | S+ | 70.6% | 256k | OpenRouter |
| Step 3.5 Flash | stepfun/step-3.5-flash:free | S+ | 74.4% | 256k | OpenRouter |
| DeepSeek V3.1 Term | deepseek-ai/deepseek-v3.1-terminus | S | 68.4% | 128k | NIM |
| Kimi K2 Instruct | moonshotai/kimi-k2-instruct | S | 65.8% | 128k | NIM |
| MiniMax M2 | minimaxai/minimax-m2 | S | 69.4% | 128k | NIM |
| Qwen3 80B Thinking | qwen/qwen3-next-80b-a3b-thinking | S | 68.0% | 128k | NIM |
| Qwen3 80B Instruct | qwen/qwen3-next-80b-a3b-instruct | S | 65.0% | 128k | NIM |
| Qwen3.5 400B VLM | qwen/qwen3.5-397b-a17b | S | 68.0% | 128k | NIM |
| GPT OSS 120B | openai/gpt-oss-120b | S | 60.0% | 128k | NIM |
| Llama 4 Maverick | meta/llama-4-maverick-17b-128e-instruct | S | 62.0% | 1M | NIM |
| DeepSeek V3.1 | deepseek-ai/deepseek-v3.1 | S | 62.0% | 128k | NIM |
| Kimi K2 Instruct | moonshotai/kimi-k2-instruct | S | 65.8% | 131k | Groq |
| Kimi K2 0905 | moonshotai/kimi-k2-instruct-0905 | S | 65.8% | 262k | Groq |
| Llama 4 Maverick | meta-llama/llama-4-maverick-17b-128e-instruct | S | 62.0% | 1M | Groq |
| GPT OSS 120B | openai/gpt-oss-120b | S | 60.0% | 128k | Groq |
| GPT OSS 120B | gpt-oss-120b | S | 60.0% | 128k | Cerebras |
| DeepSeek V3.1 Term | DeepSeek-V3.1-Terminus | S | 68.4% | 128k | SambaNova |
| DeepSeek R1 0528 | DeepSeek-R1-0528 | S | 61.0% | 128k | SambaNova |
| DeepSeek V3.1 | DeepSeek-V3.1 | S | 62.0% | 128k | SambaNova |
| DeepSeek V3 0324 | DeepSeek-V3-0324 | S | 62.0% | 128k | SambaNova |
| Llama 4 Maverick | Llama-4-Maverick-17B-128E-Instruct | S | 62.0% | 1M | SambaNova |
| GPT OSS 120B | gpt-oss-120b | S | 60.0% | 128k | SambaNova |
| DeepSeek R1 0528 | deepseek/deepseek-r1-0528:free | S | 61.0% | 128k | OpenRouter |
| Qwen3 80B Instruct | qwen/qwen3-next-80b-a3b-instruct:free | S | 65.0% | 128k | OpenRouter |
| GPT OSS 120B | openai/gpt-oss-120b:free | S | 60.0% | 128k | OpenRouter |
| Mistral Large | mistral-large-latest | S | 62.0% | 128k | Mistral AI |
| Qwen2.5 72B | qwen2.5-72b-instruct | S | 65.0% | 128k | Scaleway |
| Nemotron Ultra 253B | nvidia/llama-3.1-nemotron-ultra-253b-v1 | A+ | 56.0% | 128k | NIM |
| Mistral Large 675B | mistralai/mistral-large-3-675b-instruct-2512 | A+ | 58.0% | 256k | NIM |
| QwQ 32B | qwen/qwq-32b | A+ | 50.0% | 131k | NIM |
| Colosseum 355B | igenius/colosseum_355b_instruct_16k | A+ | 52.0% | 16k | NIM |
| Qwen3 32B | qwen/qwen3-32b | A+ | 50.0% | 131k | Groq |
| Compound | groq/compound | A+ | 52.0% | 131k | Groq |
| GLM 4.7 | zai-glm-4.7 | A+ | 52.0% | 128k | Cerebras |
| Qwen3 32B | Qwen3-32B | A+ | 50.0% | 128k | SambaNova |
| Hermes 3 405B | nousresearch/hermes-3-llama-3.1-405b:free | A+ | 50.0% | 128k | OpenRouter |
| Trinity Large | arcee-ai/trinity-large-preview:free | A+ | 48.0% | 128k | OpenRouter |
| Mistral Medium 3 | mistralai/mistral-medium-3-instruct | A | 48.0% | 128k | NIM |
| Magistral Small | mistralai/magistral-small-2506 | A | 45.0% | 32k | NIM |
| Nemotron Super 49B | nvidia/llama-3.3-nemotron-super-49b-v1.5 | A | 49.0% | 128k | NIM |
| Llama 4 Scout | meta/llama-4-scout-17b-16e-instruct | A | 44.0% | 10M | NIM |
| Nemotron Nano 30B | nvidia/nemotron-3-nano-30b-a3b | A | 43.0% | 128k | NIM |
| R1 Distill 32B | deepseek-ai/deepseek-r1-distill-qwen-32b | A | 43.9% | 128k | NIM |
| GPT OSS 20B | openai/gpt-oss-20b | A | 42.0% | 128k | NIM |
| Qwen2.5 Coder 32B | qwen/qwen2.5-coder-32b-instruct | A | 46.0% | 32k | NIM |
| Llama 3.1 405B | meta/llama-3.1-405b-instruct | A | 44.0% | 128k | NIM |
| Llama 4 Scout | meta-llama/llama-4-scout-17b-16e-instruct | A | 44.0% | 10M | Groq |
| GPT OSS 20B | openai/gpt-oss-20b | A | 42.0% | 128k | Groq |
| Compound Mini | groq/compound-mini | A | 45.0% | 131k | Groq |
| R1 Distill 70B | DeepSeek-R1-Distill-Llama-70B | A | 43.9% | 128k | SambaNova |
| Solar Pro 3 | upstage/solar-pro-3:free | A | 45.0% | 128k | OpenRouter |
| GLM 4.5 Air | z-ai/glm-4.5-air:free | A | 44.0% | 128k | OpenRouter |
| GPT OSS 20B | openai/gpt-oss-20b:free | A | 42.0% | 128k | OpenRouter |
| Nemotron Nano 30B | nvidia/nemotron-3-nano-30b-a3b:free | A | 43.0% | 128k | OpenRouter |
| Mistral Medium | mistral-medium-latest | A | 48.0% | 128k | Mistral AI |
| Qwen2.5 Coder 32B | qwen2.5-coder-32b-instruct | A | 46.0% | 32k | Scaleway |
| Llama 3.3 70B | meta/llama-3.3-70b-instruct | A- | 39.5% | 128k | NIM |
| R1 Distill 14B | deepseek-ai/deepseek-r1-distill-qwen-14b | A- | 37.7% | 64k | NIM |
| Seed OSS 36B | bytedance/seed-oss-36b-instruct | A- | 38.0% | 32k | NIM |
| Stockmark 100B | stockmark/stockmark-2-100b-instruct | A- | 36.0% | 32k | NIM |
| Llama 3.3 70B | llama-3.3-70b-versatile | A- | 39.5% | 128k | Groq |
| Llama 3.3 70B | Meta-Llama-3.3-70B-Instruct | A- | 39.5% | 128k | SambaNova |
| Llama 3.3 70B | meta-llama/llama-3.3-70b-instruct:free | A- | 39.5% | 128k | OpenRouter |
| Mistral Small 3.1 | mistralai/mistral-small-3.1-24b-instruct:free | A- | 38.0% | 128k | OpenRouter |
| Mistral Small | mistral-small-latest | A- | 38.0% | 128k | Mistral AI |
| Llama 3.3 70B | llama-3.3-70b-instruct | A- | 39.5% | 128k | Scaleway |
| Mistral Small 24B | mistral-small-24b-instruct-2501 | A- | 38.0% | 128k | Scaleway |
| Mixtral 8x22B | mistralai/mixtral-8x22b-instruct-v0.1 | B+ | 32.0% | 64k | NIM |
| Ministral 14B | mistralai/ministral-14b-instruct-2512 | B+ | 34.0% | 32k | NIM |
| Granite 34B Code | ibm/granite-34b-code-instruct | B+ | 30.0% | 32k | NIM |
| Llama 3.2 3B | meta-llama/llama-3.2-3b-instruct:free | B+ | 35.0% | 128k | OpenRouter |
| Trinity Mini | arcee-ai/trinity-mini:free | B+ | 34.0% | 128k | OpenRouter |
| Codestral | codestral-latest | B+ | 34.0% | 256k | Codestral |
| Mistral Nemo | open-mistral-nemo | B+ | 32.0% | 131k | Mistral AI |
| Codestral Mamba | open-codestral-mamba | B+ | 34.0% | 256k | Mistral AI |
| Devstral Small | devstral-small-latest | B+ | 34.0% | 128k | Mistral AI |
| Mistral Nemo | mistral-nemo-instruct-2407 | B+ | 32.0% | 128k | Scaleway |
| R1 Distill 8B | deepseek-ai/deepseek-r1-distill-llama-8b | B | 28.2% | 32k | NIM |
| R1 Distill 7B | deepseek-ai/deepseek-r1-distill-qwen-7b | B | 22.6% | 32k | NIM |
| Llama 3.1 8B | llama-3.1-8b-instant | B | 28.8% | 128k | Groq |
| Llama 3.1 8B | llama3.1-8b | B | 28.8% | 128k | Cerebras |
| Llama 3.1 8B | Meta-Llama-3.1-8B-Instruct | B | 28.8% | 128k | SambaNova |
| Gemma 3 27B | google/gemma-3-27b-it:free | B | 22.0% | 128k | OpenRouter |
| Nemotron Nano 12B VL | nvidia/nemotron-nano-12b-v2-vl:free | B | 25.0% | 128k | OpenRouter |
| Dolphin Venice | cognitivecomputations/dolphin-mistral-24b-venice-edition:free | B | 28.0% | 128k | OpenRouter |
| Gemma 3 27B | gemma-3-27b-it | B | 22.0% | 128k | Google AI |
| Ministral 8B | ministral-8b-latest | B | 25.0% | 128k | Mistral AI |
| Llama 3.1 8B | llama-3.1-8b-instruct | B | 28.8% | 128k | Scaleway |
| Nemotron Nano 9B | nvidia/nemotron-nano-9b-v2:free | B- | 20.0% | 128k | OpenRouter |
| Ministral 3B | ministral-3b-latest | B- | 20.0% | 128k | Mistral AI |
| Qwen3 4B | qwen/qwen3-4b:free | C+ | 18.0% | 128k | OpenRouter |
| Gemma 2 9B | google/gemma-2-9b-it | C | 18.0% | 8k | NIM |
| Phi 3.5 Mini | microsoft/phi-3.5-mini-instruct | C | 12.0% | 128k | NIM |
| Phi 4 Mini | microsoft/phi-4-mini-instruct | C | 14.0% | 128k | NIM |
| Gemma 3 12B | google/gemma-3-12b-it:free | C | 15.0% | 128k | OpenRouter |
| Gemma 3 4B | google/gemma-3-4b-it:free | C | 10.0% | 128k | OpenRouter |
| Gemma 3n 4B | google/gemma-3n-e4b-it:free | C | 12.0% | 128k | OpenRouter |
| Gemma 3n 2B | google/gemma-3n-e2b-it:free | C | 8.0% | 128k | OpenRouter |
| Gemma 3 12B | gemma-3-12b-it | C | 15.0% | 128k | Google AI |
| Gemma 3 4B | gemma-3-4b-it | C | 10.0% | 128k | Google AI |
| Gemma 3n 4B | gemma-3n-e4b-it | C | 12.0% | 128k | Google AI |
| LFM 2.5 1.2B | liquid/lfm-2.5-1.2b-instruct:free | D | 5.0% | 128k | OpenRouter |
| LFM 2.5 Thinking | liquid/lfm-2.5-1.2b-thinking:free | D | 5.0% | 128k | OpenRouter |
| Gemma 3 1B | gemma-3-1b-it | D | 5.0% | 128k | Google AI |
| Gemma 3n 2B | gemma-3n-e2b-it | D | 8.0% | 128k | Google AI |
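The browse-and-filter operation the explorer performs can be sketched programmatically. The snippet below is a minimal, hypothetical example, not the explorer's actual implementation: the `MODELS` list reproduces a few rows from the table above, and the field names (`tier`, `context_k`, etc.) are assumptions, not an official schema.

```python
# Hypothetical sketch of filtering the model table by tier, context window, and provider.
# Data reproduces a handful of rows from the table above; field names are assumed.
MODELS = [
    {"name": "GLM 5", "id": "z-ai/glm5", "tier": "S+", "swe": 77.8, "context_k": 128, "provider": "NIM"},
    {"name": "Kimi K2 Thinking", "id": "moonshotai/kimi-k2-thinking", "tier": "S+", "swe": 71.3, "context_k": 256, "provider": "NIM"},
    {"name": "GPT OSS 120B", "id": "openai/gpt-oss-120b:free", "tier": "S", "swe": 60.0, "context_k": 128, "provider": "OpenRouter"},
    {"name": "Gemma 3 4B", "id": "gemma-3-4b-it", "tier": "C", "swe": 10.0, "context_k": 128, "provider": "Google AI"},
]

def filter_models(models, tiers=None, min_context_k=0, provider=None):
    """Return models matching the given tier set, minimum context size, and provider."""
    return [
        m for m in models
        if (tiers is None or m["tier"] in tiers)
        and m["context_k"] >= min_context_k
        and (provider is None or m["provider"] == provider)
    ]

# Example: S+ models with at least a 256k context window.
long_ctx = filter_models(MODELS, tiers={"S+"}, min_context_k=256)
print([m["id"] for m in long_ctx])  # → ['moonshotai/kimi-k2-thinking']
```

The same predicate structure extends to sorting by SWE score (`sorted(models, key=lambda m: m["swe"], reverse=True)`) or any other column in the table.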