
GitHub Models

55 Models
All models below are provided by GitHub Models.

| Name | Model ID | Input Price ($/1M) | Output Price ($/1M) | Context (tokens) | Output Limit (tokens) |
|---|---|---|---|---|---|
| JAIS 30b Chat | jais-30b-chat | 0.00 | 0.00 | 8192 | 2048 |
| Grok 3 | grok-3 | 0.00 | 0.00 | 128000 | 8192 |
| Grok 3 Mini | grok-3-mini | 0.00 | 0.00 | 128000 | 8192 |
| Cohere Command R 08-2024 | cohere-command-r-08-2024 | 0.00 | 0.00 | 128000 | 4096 |
| Cohere Command A | cohere-command-a | 0.00 | 0.00 | 128000 | 4096 |
| Cohere Command R+ 08-2024 | cohere-command-r-plus-08-2024 | 0.00 | 0.00 | 128000 | 4096 |
| Cohere Command R | cohere-command-r | 0.00 | 0.00 | 128000 | 4096 |
| Cohere Command R+ | cohere-command-r-plus | 0.00 | 0.00 | 128000 | 4096 |
| DeepSeek-R1-0528 | deepseek-r1-0528 | 0.00 | 0.00 | 65536 | 8192 |
| DeepSeek-R1 | deepseek-r1 | 0.00 | 0.00 | 65536 | 8192 |
| DeepSeek-V3-0324 | deepseek-v3-0324 | 0.00 | 0.00 | 128000 | 8192 |
| Mistral Medium 3 (25.05) | mistral-medium-2505 | 0.00 | 0.00 | 128000 | 32768 |
| Ministral 3B | ministral-3b | 0.00 | 0.00 | 128000 | 8192 |
| Mistral Nemo | mistral-nemo | 0.00 | 0.00 | 128000 | 8192 |
| Mistral Large 24.11 | mistral-large-2411 | 0.00 | 0.00 | 128000 | 32768 |
| Codestral 25.01 | codestral-2501 | 0.00 | 0.00 | 32000 | 8192 |
| Mistral Small 3.1 | mistral-small-2503 | 0.00 | 0.00 | 128000 | 32768 |
| Phi-3-medium instruct (128k) | phi-3-medium-128k-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-3-mini instruct (4k) | phi-3-mini-4k-instruct | 0.00 | 0.00 | 4096 | 1024 |
| Phi-3-small instruct (128k) | phi-3-small-128k-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-3.5-vision instruct (128k) | phi-3.5-vision-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-4 | phi-4 | 0.00 | 0.00 | 16000 | 4096 |
| Phi-4-mini-reasoning | phi-4-mini-reasoning | 0.00 | 0.00 | 128000 | 4096 |
| Phi-3-small instruct (8k) | phi-3-small-8k-instruct | 0.00 | 0.00 | 8192 | 2048 |
| Phi-3.5-mini instruct (128k) | phi-3.5-mini-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-4-multimodal-instruct | phi-4-multimodal-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-3-mini instruct (128k) | phi-3-mini-128k-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-3.5-MoE instruct (128k) | phi-3.5-moe-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-4-mini-instruct | phi-4-mini-instruct | 0.00 | 0.00 | 128000 | 4096 |
| Phi-3-medium instruct (4k) | phi-3-medium-4k-instruct | 0.00 | 0.00 | 4096 | 1024 |
| Phi-4-Reasoning | phi-4-reasoning | 0.00 | 0.00 | 128000 | 4096 |
| MAI-DS-R1 | mai-ds-r1 | 0.00 | 0.00 | 65536 | 8192 |
| GPT-4.1-nano | gpt-4.1-nano | 0.00 | 0.00 | 128000 | 16384 |
| GPT-4.1-mini | gpt-4.1-mini | 0.00 | 0.00 | 128000 | 16384 |
| OpenAI o1-preview | o1-preview | 0.00 | 0.00 | 128000 | 32768 |
| OpenAI o3-mini | o3-mini | 0.00 | 0.00 | 200000 | 100000 |
| GPT-4o | gpt-4o | 0.00 | 0.00 | 128000 | 16384 |
| GPT-4.1 | gpt-4.1 | 0.00 | 0.00 | 128000 | 16384 |
| OpenAI o4-mini | o4-mini | 0.00 | 0.00 | 200000 | 100000 |
| OpenAI o1 | o1 | 0.00 | 0.00 | 200000 | 100000 |
| OpenAI o1-mini | o1-mini | 0.00 | 0.00 | 128000 | 65536 |
| OpenAI o3 | o3 | 0.00 | 0.00 | 200000 | 100000 |
| GPT-4o mini | gpt-4o-mini | 0.00 | 0.00 | 128000 | 16384 |
| Llama-3.2-11B-Vision-Instruct | llama-3.2-11b-vision-instruct | 0.00 | 0.00 | 128000 | 8192 |
| Meta-Llama-3.1-405B-Instruct | meta-llama-3.1-405b-instruct | 0.00 | 0.00 | 128000 | 32768 |
| Llama 4 Maverick 17B 128E Instruct FP8 | llama-4-maverick-17b-128e-instruct-fp8 | 0.00 | 0.00 | 128000 | 8192 |
| Meta-Llama-3-70B-Instruct | meta-llama-3-70b-instruct | 0.00 | 0.00 | 8192 | 2048 |
| Meta-Llama-3.1-70B-Instruct | meta-llama-3.1-70b-instruct | 0.00 | 0.00 | 128000 | 32768 |
| Llama-3.3-70B-Instruct | llama-3.3-70b-instruct | 0.00 | 0.00 | 128000 | 32768 |
| Llama-3.2-90B-Vision-Instruct | llama-3.2-90b-vision-instruct | 0.00 | 0.00 | 128000 | 8192 |
| Meta-Llama-3-8B-Instruct | meta-llama-3-8b-instruct | 0.00 | 0.00 | 8192 | 2048 |
| Llama 4 Scout 17B 16E Instruct | llama-4-scout-17b-16e-instruct | 0.00 | 0.00 | 128000 | 8192 |
| Meta-Llama-3.1-8B-Instruct | meta-llama-3.1-8b-instruct | 0.00 | 0.00 | 128000 | 32768 |
| AI21 Jamba 1.5 Large | ai21-jamba-1.5-large | 0.00 | 0.00 | 256000 | 4096 |
| AI21 Jamba 1.5 Mini | ai21-jamba-1.5-mini | 0.00 | 0.00 | 256000 | 4096 |
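The Model ID column is the identifier you pass to the GitHub Models inference API, which speaks an OpenAI-compatible chat-completions protocol. The sketch below is a minimal example of that usage, not a definitive recipe: the endpoint URL, authentication via a GitHub personal access token read from the GITHUB_TOKEN environment variable, and the `openai` Python package are assumptions not taken from this listing.

```python
# Minimal sketch of calling one of the listed models through GitHub Models.
# Assumptions (not from this page): the endpoint URL below and a GitHub
# personal access token exposed as GITHUB_TOKEN. Requires `pip install openai`.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://models.inference.ai.azure.com",  # assumed GitHub Models endpoint
    api_key=os.environ["GITHUB_TOKEN"],                # assumed GitHub PAT
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any Model ID from the table above
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=256,       # keep within the model's Output Limit
)

print(response.choices[0].message.content)
```

Swapping in another row's Model ID (for example `deepseek-r1` or `llama-3.3-70b-instruct`) is the only change needed, provided the request stays within that model's Context and Output Limit.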