Mistral AI Launches Medium 3.5 Language Model

Key Takeaways
- Mistral introduces the 128B-parameter Medium 3.5 model with cloud-coding features.
- Enhances AI capabilities for self-hosting with low hardware requirements.
- Fosters data sovereignty with open-weight models and competitive pricing.
Mistral, the French AI startup, has announced the launch of its Medium 3.5 language model, designed for self-hosting and operational efficiency on just four GPUs. The new model has 128 billion parameters and a context window of 256,000 tokens, supporting versatile functions such as instruction following, reasoning, and coding. Its deployment approach allows computational effort to be configured per request type, an advancement that positions Mistral against U.S. and Chinese competitors.
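A back-of-the-envelope calculation suggests why a 128-billion-parameter model can run on four GPUs. The precisions and GPU memory sizes below are illustrative assumptions for the sketch, not specifications from the announcement:

```python
# Rough weight-memory estimate for a 128B-parameter model.
# Bytes-per-parameter and GPU capacities are assumed, not stated by Mistral.
PARAMS = 128e9  # 128 billion parameters (stated)

def weight_memory_gb(bytes_per_param: float) -> float:
    """Approximate memory needed for the weights alone, in GB."""
    return PARAMS * bytes_per_param / 1e9

print(weight_memory_gb(2.0))  # FP16: 256.0 GB -> four 80 GB GPUs (320 GB) suffice
print(weight_memory_gb(1.0))  # FP8:  128.0 GB -> even four 48 GB GPUs could fit it
```

Note that activations and the KV cache for a 256,000-token context add further overhead on top of the weights, so the real budget is tighter than this estimate.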
The launch also reshapes Mistral's own stack: Medium 3.5 replaces Devstral 2 in Mistral's Vibe-CLI and sets a new standard for its Le Chat AI assistant. Data sovereignty is a further emphasis, as the model's weights will be released under a modified MIT license on Hugging Face. Competitive API pricing of $1.50 per million input tokens broadens access to advanced AI capabilities, aligning with national strategies to reduce reliance on foreign technology while promoting local innovation.
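The stated input price makes per-request costs easy to estimate. A minimal sketch, using only the $1.50-per-million-input-tokens figure from the announcement (output-token pricing was not given, so it is omitted here):

```python
# Input-token cost at Mistral's stated rate of $1.50 per 1M input tokens.
INPUT_PRICE_PER_M = 1.50  # USD per million input tokens (stated)

def input_cost(tokens: int) -> float:
    """Return the input-token cost in USD for a single request."""
    return tokens / 1_000_000 * INPUT_PRICE_PER_M

# A request filling the full 256,000-token context window:
print(f"${input_cost(256_000):.2f}")  # -> $0.38
```

Even a maximal-context request stays well under a dollar on the input side, which is the kind of economics the article's self-hosting and sovereignty framing leans on.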