LiteLLM
Featured · Open Source
Unified API for 100+ LLMs: an OpenAI-compatible proxy
LiteLLM provides a unified, OpenAI-compatible API across 100+ LLM providers, including OpenAI, Anthropic, Gemini, Mistral, and self-hosted models. It also ships a proxy server with cost tracking, rate limiting, and load balancing.
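A unified interface like this typically works by parsing provider-prefixed model strings (e.g. `anthropic/claude-3-5-sonnet`) and dispatching to the matching backend. A minimal sketch of that routing idea, assuming an illustrative `route_model` helper and provider table (not LiteLLM's actual internals):

```python
# Sketch of provider-prefix routing behind a unified LLM API.
# KNOWN_PROVIDERS and route_model are illustrative, not LiteLLM's internals.
KNOWN_PROVIDERS = {"openai", "anthropic", "gemini", "mistral", "ollama", "azure"}

def route_model(model: str) -> tuple[str, str]:
    """Split 'provider/model-name' into (provider, model-name).

    Strings without a known provider prefix default to 'openai',
    mirroring how an OpenAI-compatible gateway treats bare model names.
    """
    prefix, sep, rest = model.partition("/")
    if sep and prefix in KNOWN_PROVIDERS:
        return prefix, rest
    return "openai", model

print(route_model("anthropic/claude-3-5-sonnet"))  # ('anthropic', 'claude-3-5-sonnet')
print(route_model("gpt-4o"))                       # ('openai', 'gpt-4o')
```

Because every route returns the same OpenAI-style response shape, callers can swap providers by changing only the model string.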
Product Overview
Use Cases
- Multi-LLM Routing
- LLM Cost Tracking
- Provider Fallback
- AI Gateway
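Of the use cases above, provider fallback amounts to trying deployments in priority order and returning the first success. A hedged sketch (the `call_with_fallback` helper is invented for illustration; LiteLLM's own router adds policies such as cooldowns and retries):

```python
def call_with_fallback(providers, prompt: str) -> str:
    """Try each (name, call_fn) pair in order; return the first successful reply.

    `providers` is a list of (name, callable) pairs; a production router
    would also track cooldowns, retries, and per-deployment rate limits.
    """
    errors = []
    for name, call in providers:
        try:
            return call(prompt)
        except Exception as exc:  # collect the failure and try the next provider
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))

def flaky(prompt):    # stands in for a provider that is down
    raise TimeoutError("connection timed out")

def healthy(prompt):  # stands in for a working provider
    return f"echo: {prompt}"

print(call_with_fallback([("primary", flaky), ("backup", healthy)], "hi"))
# echo: hi
```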
Ideal For
AI Platform Teams · Enterprise AI Architects · Cost-Conscious AI Teams
Architecture Fit
Enterprise Ready · Self-Hosted · Cloud Native · API First · Multi-Agent Compatible · Kubernetes Support · Open Source
Technical Details
- Deployment Model: self-hosted
- LLM Providers: OpenAI, Anthropic, Gemini, Mistral, Ollama, Azure OpenAI
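Cost tracking across providers like these rests on per-model token pricing: multiply prompt and completion token counts by the model's per-token rates. A small sketch of that arithmetic (the price table holds made-up example values, not current provider pricing):

```python
# Example per-1K-token prices in USD; values are illustrative, not real pricing.
PRICES = {
    "gpt-4o": {"prompt": 0.005, "completion": 0.015},
    "claude-3-haiku": {"prompt": 0.00025, "completion": 0.00125},
}

def usage_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Return the USD cost of one request, given the per-1K-token price table."""
    p = PRICES[model]
    return (prompt_tokens / 1000) * p["prompt"] \
         + (completion_tokens / 1000) * p["completion"]

cost = usage_cost("gpt-4o", prompt_tokens=2000, completion_tokens=1000)
print(f"${cost:.3f}")  # $0.025
```

A gateway aggregates these per-request figures by API key or team to enforce budgets and produce spend reports.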