openai-proxy
OpenAI Proxy (100+ LLMs) - OpenAI, Azure, Bedrock, Anthropic, HuggingFace
BerriAI/litellm
A fast and lightweight OpenAI-compatible server for calling 100+ LLM APIs.
Call all LLM APIs using the OpenAI format. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate, and more (100+ LLMs).
Test your deployed proxy:

import openai

openai.api_key = "anything"              # dummy key; the proxy handles provider credentials
openai.api_base = "http://0.0.0.0:8000"

print(openai.ChatCompletion.create(model="test", messages=[{"role": "user", "content": "Hey!"}]))
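
Because the proxy speaks the OpenAI format, switching providers is just a matter of changing the model name in the same request. A minimal sketch, assuming the proxy is running locally on port 8000 and is configured with credentials for the providers you name; the model names below are illustrative:

import openai

openai.api_key = "anything"              # dummy key; the proxy holds the real provider credentials
openai.api_base = "http://0.0.0.0:8000"  # point the OpenAI SDK at the deployed proxy

# The same OpenAI-format call, routed to different providers by model name.
# These model names are examples; use whichever models your proxy is set up for.
for model in ["gpt-3.5-turbo", "claude-instant-1", "command-nightly"]:
    response = openai.ChatCompletion.create(
        model=model,
        messages=[{"role": "user", "content": "Hey!"}],
    )
    print(model, "->", response["choices"][0]["message"]["content"])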