A lightweight LiteLLM server boilerplate pre-configured with uv and Docker for hosting your own OpenAI- and Anthropic-compatible endpoints. Includes LibreChat as an optional web UI.
Updated Dec 8, 2025 - Python
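A proxy like this one typically exposes an OpenAI-compatible chat completions endpoint. The sketch below shows how such an endpoint might be queried with the official OpenAI Python SDK; the base URL, port, API key, and model name are illustrative assumptions, not values taken from this repository.

```python
from openai import OpenAI

# Point the standard OpenAI client at a self-hosted LiteLLM proxy.
# base_url, api_key, and model below are placeholder assumptions:
# LiteLLM's proxy listens on port 4000 by default, and the key is
# whatever master or virtual key the proxy was configured with.
client = OpenAI(
    base_url="http://localhost:4000",
    api_key="sk-placeholder",
)

response = client.chat.completions.create(
    model="gpt-4o",  # any model name registered in the proxy's model_list
    messages=[{"role": "user", "content": "Hello from the self-hosted proxy!"}],
)
print(response.choices[0].message.content)
```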
A high-performance, async-first system that computes and monitors LLM API spend across millions of requests, backed by PostgreSQL and Prometheus.
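As a rough illustration of the spend-tracking idea, the sketch below accumulates per-request cost on a labeled Prometheus counter using the prometheus_client library; the metric and label names are hypothetical and do not reflect this repository's actual implementation.

```python
from prometheus_client import Counter

# Hypothetical metric and labels, for illustration only.
LLM_SPEND_USD = Counter(
    "llm_spend_usd_total",
    "Cumulative LLM API spend in US dollars",
    labelnames=["model", "team"],
)

def record_spend(model: str, team: str, cost_usd: float) -> None:
    """Add one request's cost to the labeled spend counter."""
    LLM_SPEND_USD.labels(model=model, team=team).inc(cost_usd)
```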
Claude Code Router (LiteLLM)