Mistral

Mistral: Mistral Small 3.1 24B

Released Mar 17, 2025
apache-2.0 license
128,000 context
24B parameters
open · multimodal

Overview

Mistral Small 3.1 24B, released on March 17, 2025, is an open-weight multimodal model from Mistral AI, distributed under the Apache-2.0 license. With around 24B parameters and a 128K token context window, it is available in both base and instruction-tuned (“Instruct”) variants. The model introduces vision support alongside text, enabling tasks like multimodal reasoning, captioning, and image-based Q&A.
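As an illustration of the image-based Q&A workflow, the sketch below sends a single text-plus-image message to an OpenAI-compatible chat-completions endpoint. The endpoint URL, the model identifier, and the exact multimodal payload shape are assumptions made for illustration; consult the provider's API documentation for the actual values.

```python
# Minimal sketch: image-based Q&A against an OpenAI-compatible
# chat-completions endpoint. The endpoint URL, model ID, and the
# multimodal payload shape are ASSUMPTIONS, not confirmed API details.
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed endpoint

payload = {
    "model": "mistral-small-latest",  # assumed model identifier
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {"type": "image_url", "image_url": "https://example.com/photo.jpg"},
            ],
        }
    ],
    "max_tokens": 512,
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```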

It is multilingual and optimized for fast responses, function calling, structured dialogue, and long-context reasoning. Despite its size, the model can run locally in quantized formats on machines with roughly 32 GB of RAM, making it accessible to developers outside large cloud setups. However, the maximum output length is smaller than the 128K input window, so very long generations may need to be chained across multiple requests. Using the full vision features or the maximum context window also significantly increases compute costs, and performance on highly complex reasoning or enterprise-scale tasks still trails larger proprietary frontier models.
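Because the output limit is smaller than the input window, one common pattern is to chain requests: when a response is cut off at the token limit, append it to the conversation and ask the model to continue. The sketch below assumes the same OpenAI-compatible endpoint, model identifier, and response fields (e.g. `finish_reason == "length"`) as the previous example; these are assumptions for illustration rather than confirmed details of Mistral's API.

```python
# Minimal sketch: chaining generations when the output limit is hit
# before the answer is complete. Endpoint, model ID, and field names
# follow the OpenAI-compatible schema and are ASSUMPTIONS here.
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed endpoint
HEADERS = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}

messages = [{"role": "user", "content": "Write a detailed design document for a ticketing system."}]
full_answer = ""

for _ in range(4):  # cap the number of continuation rounds
    resp = requests.post(
        API_URL,
        headers=HEADERS,
        json={"model": "mistral-small-latest", "messages": messages, "max_tokens": 4096},
        timeout=120,
    )
    resp.raise_for_status()
    choice = resp.json()["choices"][0]
    chunk = choice["message"]["content"]
    full_answer += chunk
    if choice.get("finish_reason") != "length":
        break  # the model stopped on its own; no continuation needed
    # Feed the partial answer back and ask the model to pick up where it stopped.
    messages += [
        {"role": "assistant", "content": chunk},
        {"role": "user", "content": "Continue exactly where you left off."},
    ]

print(full_answer)
```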
