Phi 3 Medium 128k Instruct

phi-3-medium-128k-instruct

Provider: Nvidia · Context: 128,000 tokens · Output limit: 4,096 tokens

Available from 6 providers

| Provider | Source | Input Price ($/1M) | Output Price ($/1M) | Description | Free |
|---|---|---|---|---|---|
| nvidia | models-dev | $0.00 | $0.00 | Provider: Nvidia, Context: 128000, Output Limit: 4096 | Yes |
| githubmodels | models-dev | $0.00 | $0.00 | Provider: GitHub Models, Context: 128000, Output Limit: 4096 | Yes |
| azure | models-dev | $0.17 | $0.68 | Provider: Azure, Context: 128000, Output Limit: 4096 | |
| azurecognitiveservices | models-dev | $0.17 | $0.68 | Provider: Azure Cognitive Services, Context: 128000, Output Limit: 4096 | |
| azureai | litellm | $0.17 | $0.68 | Source: azure_ai, Context: 128000 | |
| openrouter | openrouter | $1.00 | $1.00 | Phi-3 128K Medium is a powerful 14-billion parameter model designed for advanced language understanding, reasoning, and instruction following. Optimized through supervised fine-tuning and preference adjustments, it excels in tasks involving common sense, mathematics, logical reasoning, and code processing. At time of release, Phi-3 Medium demonstrated state-of-the-art performance among lightweight models. In the MMLU-Pro eval, the model even comes close to a Llama3 70B level of performance. For 4k context length, try [Phi-3 Medium 4K](/models/microsoft/phi-3-medium-4k-instruct). Context: 128000 | |
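Since all prices above are quoted per 1M tokens, the cost of a single request is simply `(input_tokens / 1,000,000) × input_price + (output_tokens / 1,000,000) × output_price`. Below is a minimal sketch of that arithmetic in Python; the function name and the example token counts are illustrative, and the rates used are the Azure rates from the table ($0.17 input, $0.68 output per 1M tokens).

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_m: float, output_price_per_m: float) -> float:
    """Return the USD cost of one request given per-1M-token prices."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m

# Example: a 100,000-token prompt (well inside the 128k context window) with a
# 4,096-token completion (the output limit) at the Azure rates above.
cost = estimate_cost(100_000, 4_096, 0.17, 0.68)
print(f"${cost:.4f}")  # ≈ $0.0198
```

The same helper applies to any row in the table: swap in that provider's input and output prices to compare what the identical request would cost elsewhere.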