ministral-8b
A more powerful model with faster, memory-efficient inference, ideal for complex workflows and demanding edge applications.
| Provider | Source | Input Price ($/1M tokens) | Output Price ($/1M tokens) | Description | Free |
|---|---|---|---|---|---|
| vercel | vercel | $0.10 | $0.10 | A more powerful model with faster, memory-efficient inference, ideal for complex workflows and demanding edge applications. | |
| openrouter | openrouter | $0.10 | $0.10 | Ministral 8B is an 8B-parameter model featuring an interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to a 128k (131,072-token) context length and excels at knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it well suited to low-latency, privacy-first applications. | |
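
As a quick sketch of how the OpenRouter listing above can be used: OpenRouter exposes an OpenAI-compatible chat completions endpoint, so the standard `openai` client can target it by overriding the base URL. The model slug `mistralai/ministral-8b` and the `OPENROUTER_API_KEY` environment variable name are assumptions for illustration and should be checked against the OpenRouter catalog.

```python
# Minimal sketch: calling ministral-8b through OpenRouter's OpenAI-compatible API.
# Assumption: the model slug "mistralai/ministral-8b" and the OPENROUTER_API_KEY
# environment variable name are illustrative, not confirmed by this listing.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="mistralai/ministral-8b",  # assumed slug for this listing
    messages=[
        {"role": "user", "content": "Summarize sliding-window attention in one sentence."}
    ],
    max_tokens=128,
)

print(response.choices[0].message.content)
```

At the listed rates ($0.10 per 1M tokens for both input and output), a request that consumes 1,000 input tokens and produces 500 output tokens costs roughly $0.00015.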