AI Providers
Kilo Code supports a wide variety of AI providers, giving you flexibility in how you power your AI-assisted development workflow. Choose from cloud providers, local models, or AI gateways based on your needs.
Getting Started
The fastest way to get started is with Kilo Code's built-in provider, which requires no configuration. Simply sign in and start coding.
For users who want to use their own API keys or need specific models, we support over 30 providers.
Provider Categories
Cloud Providers
Major AI companies offering powerful models via API:
- Anthropic - Claude models (Claude 4, Claude 3.5 Sonnet, etc.)
- OpenAI - GPT-4, GPT-4o, o1, and more
- Google Gemini - Gemini Pro, Gemini Ultra
- DeepSeek - DeepSeek V3, R1
- Mistral - Mistral Large, Codestral
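If you want a feel for what talking to one of these cloud providers directly looks like (outside of Kilo Code's own configuration), here is a minimal sketch using Anthropic's official TypeScript SDK. The model ID and prompt are placeholders, and it assumes you have an `ANTHROPIC_API_KEY` in your environment; it is an illustration of the general pattern, not Kilo Code's internal implementation.

```typescript
// Minimal sketch: calling a cloud provider (Anthropic) directly with its TypeScript SDK.
// Assumes `npm install @anthropic-ai/sdk` and an ANTHROPIC_API_KEY environment variable.
import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

async function main() {
  const message = await anthropic.messages.create({
    model: "claude-3-5-sonnet-latest", // placeholder; use a model your key has access to
    max_tokens: 256,
    messages: [{ role: "user", content: "Write a one-line TypeScript hello world." }],
  });
  console.log(message.content);
}

main().catch(console.error);
```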
Local & Self-Hosted
Run models on your own hardware for privacy and offline use:
- Ollama - Easy local model management
- LM Studio - Desktop app for local models
- OpenAI Compatible - Any OpenAI-compatible endpoint
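Both Ollama and LM Studio expose OpenAI-compatible HTTP endpoints, so any OpenAI client can talk to them by pointing its base URL at the local server. Below is a minimal sketch assuming Ollama is running on its default port with a model already pulled (the model name is a placeholder); LM Studio works the same way on its own default port.

```typescript
// Minimal sketch: talking to a local OpenAI-compatible server (here, Ollama's default endpoint).
// Assumes `npm install openai`, Ollama running locally, and a model already pulled (e.g. `ollama pull llama3.1`).
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint; LM Studio defaults to port 1234
  apiKey: "ollama", // local servers typically ignore the key, but the client requires one
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "llama3.1", // must match a model you have pulled locally
    messages: [{ role: "user", content: "Summarize what an OpenAI-compatible endpoint is." }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```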
AI Gateways
Route requests through unified APIs with additional features:
- OpenRouter - Access multiple providers through one API
- Glama - Enterprise AI gateway
- Requesty - Smart routing and fallbacks
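Gateways generally speak the same OpenAI wire format, so switching to one is mostly a matter of changing the base URL, the API key, and the (provider-namespaced) model ID. A rough sketch using OpenRouter, assuming an `OPENROUTER_API_KEY` environment variable; the model ID shown is just an example:

```typescript
// Minimal sketch: routing a request through a gateway (OpenRouter) using the OpenAI client.
// Assumes `npm install openai` and an OPENROUTER_API_KEY environment variable.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

async function main() {
  const completion = await client.chat.completions.create({
    // Gateway model IDs are namespaced by upstream provider, e.g. "anthropic/claude-3.5-sonnet".
    model: "anthropic/claude-3.5-sonnet",
    messages: [{ role: "user", content: "Name one benefit of using an AI gateway." }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```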
Choosing a Provider
| Priority | Recommended Provider |
|---|---|
| Ease of use | Kilo Code (built-in) |
| Best value | Zhipu AI or Mistral |
| Privacy/Offline | Ollama or LM Studio |
| Enterprise | AWS Bedrock or Google Vertex |
Why Use Multiple Providers?
- Cost - Compare pricing across providers for different tasks
- Reliability - Backup options when a provider has outages
- Models - Access exclusive or specialized models
- Regional - Better latency in certain locations
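The reliability point in particular is easy to picture as a fallback chain: try the primary provider, and move to the next one only if the request fails. The sketch below is purely illustrative (the config shape and function names are made up, not part of Kilo Code), but it shows the idea.

```typescript
// Hypothetical sketch of a provider fallback chain: try each configured endpoint in order
// and return the first successful response. Names and config shape are illustrative only.
import OpenAI from "openai";

interface ProviderConfig {
  name: string;
  baseURL: string;
  apiKey: string;
  model: string;
}

async function completeWithFallback(providers: ProviderConfig[], prompt: string): Promise<string> {
  for (const p of providers) {
    try {
      const client = new OpenAI({ baseURL: p.baseURL, apiKey: p.apiKey });
      const res = await client.chat.completions.create({
        model: p.model,
        messages: [{ role: "user", content: prompt }],
      });
      return res.choices[0].message.content ?? "";
    } catch (err) {
      console.warn(`Provider ${p.name} failed, trying next:`, err);
    }
  }
  throw new Error("All providers failed");
}
```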
Note
All API keys use VS Code's Secret Storage; they are never stored in plain text.
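For context, this is what VS Code's Secret Storage API looks like from an extension's point of view. The key name below is a made-up example, not the key Kilo Code actually uses; the point is that secrets go through the OS credential store rather than `settings.json`.

```typescript
// Illustrative only: how a VS Code extension stores and reads a secret via SecretStorage.
// The key name "example.apiKey" is hypothetical; it is not Kilo Code's actual storage key.
import * as vscode from "vscode";

export async function activate(context: vscode.ExtensionContext) {
  // Store a key: it is kept in the OS credential store, never written to settings.json.
  await context.secrets.store("example.apiKey", "sk-...");

  // Read it back later; returns undefined if nothing has been stored.
  const apiKey = await context.secrets.get("example.apiKey");
  console.log(apiKey ? "API key found" : "No API key stored");
}
```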
Next Steps
- New to Kilo Code? Start with the Kilo Code provider - no setup required
- Have an API key? Jump to your provider's page for configuration instructions
- Want to compare? Check out Model Selection for guidance on choosing models