glm-4.7
GLM-4.7 is Z.ai’s latest flagship model, with major upgrades focused on two key areas: stronger coding capabilities and more stable multi-step reasoning and execution.
| Provider | Source | Input Price ($/1M) | Output Price ($/1M) | Description | Free |
|---|---|---|---|---|---|
| vercel | vercel | $0.43 | $1.75 | GLM-4.7 is Z.ai’s latest flagship model, with major upgrades focused on two key areas: stronger coding capabilities and more stable multi-step reasoning and execution. | |
| poe | poe | - | - | GLM-4.7 is Z.AI's latest flagship model, with major upgrades focused on advanced coding capabilities and more reliable multi-step reasoning and execution. It shows clear gains in complex agent workflows, while delivering a more natural conversational experience and stronger front-end design sensibility. File support: text, Markdown, and PDF files. Context window: 205k tokens. Optional parameters (see the usage sketch below the table): `--enable_thinking true` enables thinking about the response before giving a final answer (disabled by default); `--temperature` takes a number from 0 to 2 to control randomness in the response, where lower values make the output more focused and deterministic (0.7 by default); `max_output_token` takes a number from 1 to 131072 to set the number of tokens generated in the response (131072 by default). | |
| baseten | models-dev | $0.60 | $2.20 | Provider: Baseten, Context: 204800, Output Limit: 131072 | |
| zenmux | models-dev | $0.28 | $1.14 | Provider: ZenMux, Context: 200000, Output Limit: 64000 | |
| synthetic | models-dev | $0.55 | $2.19 | Provider: Synthetic, Context: 200000, Output Limit: 64000 | |
| deepinfra | models-dev | $0.43 | $1.75 | Provider: Deep Infra, Context: 202752, Output Limit: 16384 | |
| zhipuai | models-dev | $0.60 | $2.20 | Provider: Zhipu AI, Context: 204800, Output Limit: 131072 | |
| nanogpt | models-dev | $1.00 | $2.00 | Provider: NanoGPT, Context: 204800, Output Limit: 8192 | |
| aihubmix | models-dev | $0.27 | $1.10 | Provider: AIHubMix, Context: 204800, Output Limit: 131072 | |
| openrouter | openrouter | $0.16 | $0.80 | GLM-4.7 is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning and execution. It demonstrates significant improvements on complex agent tasks while delivering a more natural conversational experience and superior front-end aesthetics. Context: 202752 | |
| zai | zai | $0.60 | $0.11 | - | |
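
Most of the providers above expose an OpenAI-compatible chat completions endpoint, so the temperature and output-token limits described in the table can be set per request. The sketch below is a minimal, unverified example: the base URL, the model slug `z-ai/glm-4.7`, and the `OPENROUTER_API_KEY` environment variable are assumptions, not details confirmed by this page; check your provider's documentation for the exact values.

```python
# Minimal sketch: calling GLM-4.7 through an OpenAI-compatible endpoint.
# Assumptions: OpenRouter's base URL, the model slug "z-ai/glm-4.7", and the
# OPENROUTER_API_KEY environment variable -- verify against your provider's docs.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed provider endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="z-ai/glm-4.7",  # assumed model slug
    messages=[
        {"role": "user", "content": "Summarize the tradeoffs of multi-step agent workflows."}
    ],
    temperature=0.7,  # 0 to 2; lower is more focused and deterministic (0.7 is the documented default)
    max_tokens=4096,  # well under the 131072-token output ceiling some providers list
)

print(response.choices[0].message.content)
```

The same request works against other OpenAI-compatible providers in the table by swapping the base URL, API key, and model identifier.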