changelogs.info
// model_detail
legacy open-weight

Mixtral 8x7B

by Mistral

Pioneering open-weight mixture-of-experts (MoE) model: 8 experts of ~7B parameters each, with 2 experts routed per token. 32K context.

Context window 32K tokens
Max output 8K tokens
Pricing $0.10 in / $0.30 out per 1M tokens (OpenRouter)
Released Dec 11, 2023
// specifications

Specs & capabilities

API details

API model ID
open-mixtral-8x7b
Internal ID
mixtral-8x7b
Status
legacy
Type
open-weight
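Using the API model ID above, a request can be sketched as follows. The endpoint URL and payload shape assume Mistral's OpenAI-compatible chat-completions API; verify both against the current Mistral documentation before use, and the API key shown is a placeholder.

```python
# Sketch: building a chat-completions request for open-mixtral-8x7b.
# MISTRAL_CHAT_URL is an assumed endpoint (OpenAI-compatible shape).
MISTRAL_CHAT_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(prompt: str, api_key: str) -> tuple[dict, dict]:
    """Return (headers, payload) for a single-turn chat completion."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": "open-mixtral-8x7b",       # API model ID from the spec above
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 8_000,                # listed max output
    }
    return headers, payload

headers, payload = build_request("Hello!", api_key="sk-PLACEHOLDER")
print(payload["model"])
```

The payload dict can then be POSTed with any HTTP client; only the `model` field ties the request to this specific model.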

Capabilities

Vision
No
Function calling
🔧 Yes
Knowledge cutoff
📅 2023-12

Pricing

Input
$0.10 / 1M tokens
Output
$0.30 / 1M tokens
Source
OpenRouter pricing
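The per-million-token rates above translate into per-request cost with simple arithmetic; a minimal estimator, assuming the OpenRouter prices listed here:

```python
# Rough cost estimator at the listed OpenRouter rates
# ($0.10 / 1M input tokens, $0.30 / 1M output tokens).
INPUT_PRICE_PER_M = 0.10
OUTPUT_PRICE_PER_M = 0.30

def estimate_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the per-million rates above."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 20K-token prompt with a 2K-token completion:
print(f"${estimate_cost_usd(20_000, 2_000):.4f}")  # → $0.0026
```

Output tokens cost 3x input tokens here, so long completions dominate the bill.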
// data_confidence

Data notes

⚠️ Pricing shown is sourced from OpenRouter, not Mistral's first-party rate card
// notes

Additional notes

Pioneering MoE model
