openai-proxy

OpenAI Proxy (100+ LLMs) - OpenAI, Azure, Bedrock, Anthropic, HuggingFace

Deploy openai-proxy


A fast and lightweight OpenAI-compatible server for calling 100+ LLM APIs.

Call all LLM APIs using the OpenAI format. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate, and more (100+ LLMs).

Test your deployed proxy:

import openai

openai.api_base = "http://0.0.0.0:8000"

print(openai.ChatCompletion.create(model="test", messages=[{"role": "user", "content": "Hey!"}]))
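
Once the proxy is running, the same OpenAI-format call can be routed to any provider the proxy is configured for by changing the model name. A minimal sketch, assuming the proxy is reachable at the address above and has an Anthropic key configured; the model name, placeholder API key, and response handling below are illustrative, not template defaults:

import openai

# Point the pre-1.0 OpenAI SDK (same style as the snippet above) at the deployed proxy.
openai.api_base = "http://0.0.0.0:8000"
# Placeholder value: provider credentials live on the proxy, not the client (assumes proxy auth is not enabled).
openai.api_key = "anything"

# Same OpenAI-format request, routed to a non-OpenAI provider by model name
# (assumes the proxy has ANTHROPIC_API_KEY set in its environment).
response = openai.ChatCompletion.create(
    model="claude-2",
    messages=[{"role": "user", "content": "Hey!"}],
)
print(response["choices"][0]["message"]["content"])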


Template Content

openai-proxy

BerriAI/litellm

More templates in this category

Chat Chat

Chat Chat: your own unified chat and search-to-AI platform.


openui

Deploy OpenUI: AI-powered UI generation with GitHub OAuth and OpenAI API.


firecrawl

Firecrawl API server + worker without auth; works with Dify.