
Mistral Large

License: proprietary

Mistral's flagship model with a 128K-token context window, strong multilingual capabilities, and function calling.

Model ID: mistral/mistral-large
Context Window: 131K
Max Output: 16K
Providers: 1
Released: 2025-03
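The context window and max output interact: room reserved for the completion comes out of the same 131K budget as the prompt. A minimal sketch of that arithmetic, using the limits listed above (actual token counting would require the model's tokenizer and is out of scope here):

```python
# Constants mirror the spec sheet above: "131K" context, "16K" max output.
CONTEXT_WINDOW = 131_072
MAX_OUTPUT = 16_384

def max_prompt_tokens(reserved_output: int = MAX_OUTPUT) -> int:
    """Tokens left for the prompt after reserving room for the completion."""
    return CONTEXT_WINDOW - reserved_output

print(max_prompt_tokens())       # 114688 with the full 16K reserved
print(max_prompt_tokens(1_024))  # 130048 if you only need a short reply
```

Reserving less output room when you expect a short completion frees most of the window for prompt content.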

Capabilities

chat, code, reasoning, tools, function_calling, streaming, json_mode
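For the function_calling capability, tool definitions are typically passed as JSON schemas in the common OpenAI-style format. A sketch of one such schema is below; the schema shape is standard, but whether MagicRouter's `mr.chat` accepts it via a `tools` parameter is an assumption, not confirmed by this page:

```python
# Hypothetical example tool in the common OpenAI-style schema format.
# The `get_weather` name and its parameters are illustrative only.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}
```

When the model decides to call the tool, the response carries the function name and JSON-encoded arguments rather than plain text; your code executes the function and sends the result back in a follow-up message.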

Pricing by Provider

Provider   Input $/1M   Output $/1M   Latency p50   Latency p95   Status
mistral    $2.00        $6.00         350ms         1000ms
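Since rates are quoted per million tokens, a request's cost is a simple weighted sum. A sketch using the listed mistral provider rates (the helper name is ours, not part of any SDK):

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_per_m: float = 2.00,
                      output_per_m: float = 6.00) -> float:
    """Cost of one request at the listed rates, quoted in $ per 1M tokens."""
    return (input_tokens * input_per_m + output_tokens * output_per_m) / 1_000_000

# A 10K-token prompt with a 1K-token completion:
print(f"${estimate_cost_usd(10_000, 1_000):.4f}")  # $0.0260
```

Output tokens cost 3x input tokens here, so long completions dominate the bill even when prompts are large.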

Quick Start

Python
import magicrouter

mr = magicrouter.Client(
    provider_keys={"mistral": "your-api-key"}
)

response = mr.chat(
    "mistral/mistral-large",
    "Your prompt here"
)
print(response.choices[0].message.content)
TypeScript
import { MagicRouter } from "magicrouter";

const mr = new MagicRouter({
  providerKeys: { mistral: "your-api-key" }
});

const response = await mr.chat({
  model: "mistral/mistral-large",
  messages: [{ role: "user", content: "Your prompt here" }]
});
console.log(response.choices[0].message.content);
cURL (calling the Mistral API directly)
curl https://api.mistral.ai/v1/chat/completions \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral-large-latest",
    "messages": [{"role": "user", "content": "Your prompt here"}]
  }'

Use this model

Sign up for free and test Mistral Large in the playground
