Xiaomi: MiMo-V2.5
Maker of the MiMo-V2.5 and MiMo-V2 open-weight coding models. Xiaomi’s coding models on Kilo Code include MiMo-V2.5, MiMo-V2-Pro, and MiMo-V2.5-Pro. Use them across VS Code, JetBrains IDEs, Cursor, Windsurf, Trae, and the Kilo CLI — with pay-as-you-go pricing and no markup over the underlying provider rates.
Sorted by coding-index where published. Click any model for the full review with benchmarks, real-world Kilo usage, and provider-specific pricing.
Xiaomi
MiMo-V2.5 is a native omnimodal model by Xiaomi. It delivers Pro-level agentic performance at roughly half the inference cost, while surpassing MiMo-V2-Omni in multimodal perception across image and video understanding...
Xiaomi
MiMo-V2-Pro is Xiaomi's flagship foundation model, featuring over 1T total parameters and a 1M context length, deeply optimized for agentic scenarios. It is highly adaptable to general agent frameworks like...
Xiaomi
MiMo-V2.5-Pro is Xiaomi’s flagship model, delivering strong performance in general agentic capabilities, complex software engineering, and long-horizon tasks, with top rankings on benchmarks such as ClawEval, GDPVal, and SWE-bench Pro...
Xiaomi is best known for consumer electronics, but its AI division also ships the MiMo family of open-weight language models. MiMo-V2.5-Pro is Xiaomi's current flagship for complex software engineering and long-horizon agentic tasks. MiMo-V2.5 offers Pro-level agentic performance at lower inference cost, while MiMo-V2-Pro and MiMo-V2-Flash round out the long-context and fast-response tiers.
Pay-as-you-go, no markup over the underlying provider rates. Cheapest first.
| Model | Input / 1M | Output / 1M | Context | Coding index |
|---|---|---|---|---|
| Xiaomi: MiMo-V2.5 | $0.400 | $2.00 | 1049K | 42.1 |
| Xiaomi: MiMo-V2-Pro | $1.00 | $3.00 | 1049K | — |
| Xiaomi: MiMo-V2.5-Pro | $1.00 | $3.00 | 1049K | — |
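As a worked example of the pay-as-you-go rates above, the snippet below estimates per-request cost from the table's published prices. The token counts are illustrative, not real usage figures.

```python
# Estimate request cost at the pay-as-you-go rates listed in the table above.
# Prices are USD per 1M tokens; token counts in the example are illustrative.

RATES = {
    "Xiaomi: MiMo-V2.5":     {"input": 0.40, "output": 2.00},
    "Xiaomi: MiMo-V2-Pro":   {"input": 1.00, "output": 3.00},
    "Xiaomi: MiMo-V2.5-Pro": {"input": 1.00, "output": 3.00},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request for the given model."""
    r = RATES[model]
    return (input_tokens / 1_000_000) * r["input"] + \
           (output_tokens / 1_000_000) * r["output"]

# A coding-agent turn with 50K tokens of context and an 8K-token reply:
print(f"${request_cost('Xiaomi: MiMo-V2.5', 50_000, 8_000):.3f}")  # $0.036
```

At these rates, a long agentic session on MiMo-V2.5 costs roughly 40% of the same session on the Pro tiers, which is where the "Pro-level performance at roughly half the inference cost" claim comes from.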
Three ways to run them: hosted in Kilo, locally on your hardware, or through your own provider keys.
The fastest path: install Kilo Code, sign in, and pick a Xiaomi model from the model picker. No API keys, no markup. Works in VS Code, JetBrains, Cursor, Windsurf, Trae, and the Kilo CLI.
See live model leaderboard →

Already have an account with Xiaomi, OpenRouter, AWS Bedrock, Google Vertex, Together AI, or another compatible provider? Plug your key into Kilo Code and keep your existing billing relationship.
BYOK setup guide →

Download Xiaomi open weights from Hugging Face and serve them with Ollama, LM Studio, vLLM, or SGLang. Connect Kilo Code to your local OpenAI-compatible endpoint and keep all prompts on hardware you control.
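Once a runtime like vLLM is serving the weights, any OpenAI-compatible client can talk to it. A minimal sketch using only the Python standard library is below; the base URL (vLLM's default port) and the Hugging Face model id are assumptions, so substitute whatever your local server actually reports.

```python
# Sketch: calling a locally served MiMo model through an OpenAI-compatible
# endpoint (e.g. started with `vllm serve`). BASE_URL and MODEL_ID are
# assumptions — use the values your own runtime reports.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"   # vLLM's default port; adjust to your setup
MODEL_ID = "XiaomiMiMo/MiMo-V2.5"       # hypothetical Hugging Face repo id

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build the chat-completions HTTP request without sending it."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(BASE_URL, MODEL_ID, "Write a binary search in Python.")
# With the server running, send it with:
#   resp = json.load(urllib.request.urlopen(req))
#   print(resp["choices"][0]["message"]["content"])
```

Pointing Kilo Code at the same `/v1` base URL gives the editor integration the identical request shape, with no prompts leaving your machine.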
Local setup guide →

See coding-model lineups from Xiaomi’s closest competitors.
Xiaomi’s flagship is among the strongest coding models on the Kilo Code leaderboard, ranked by Code, Plan, Ask, Debug, and Review usage.
VS Code, Cursor, Windsurf, Trae, JetBrains IDEs (IntelliJ, PyCharm, WebStorm, GoLand, RubyMine, Android Studio), and the Kilo CLI / terminal.
Switch between Xiaomi and 500+ other models with one click. Pay only for what you use, at the underlying provider rate.
MiMo-V2.5 is currently the strongest Xiaomi coding model by published Kilo coding index (42.1). MiMo-V2.5-Pro is the flagship for long-horizon agentic work with 1T+ scale and 1M context. MiMo-V2-Pro remains a strong 1M-context option, while MiMo-V2-Flash is the fast lightweight tier.
All Xiaomi MiMo models work in VS Code, Cursor, Windsurf, Trae, JetBrains IDEs, and the Kilo CLI. Open weights can additionally be self-hosted through OpenAI-compatible runtimes such as vLLM or SGLang.
MiMo-V2.5 is $0.40 per million input / $2.00 output on Kilo Code. MiMo-V2.5-Pro and MiMo-V2-Pro are both $1.00 / $3.00. MiMo-V2-Flash pricing is not yet published; check the Kilo model picker for the latest rates.
Yes. Xiaomi MiMo models are open-weight models. You can use them hosted in Kilo Code or self-host compatible weights with vLLM, SGLang, or another OpenAI-compatible runtime.
MiMo-V2.5-Pro is Xiaomi's flagship for complex software engineering, long-horizon tasks, and 1M-context agentic workflows. MiMo-V2.5 delivers similar agentic performance at about half the inference cost. MiMo-V2-Pro is the previous flagship foundation model, and MiMo-V2-Flash is optimised for faster, cheaper everyday coding tasks.
Install Kilo Code and get instant access to Xiaomi: MiMo-V2.5 and 2 other Xiaomi models, plus 500+ frontier and open-source options. Free to start, no credit card required.