OpenAI-Proxy

OpenAI Proxy (100+ LLMs) - OpenAI, Azure, Bedrock, Anthropic, HuggingFace

Deploy OpenAI-Proxy

Service: litellm
Image: berriai/litellm:main-stable

A fast, lightweight OpenAI-compatible server to call 100+ LLM APIs.

Call all LLM APIs using the OpenAI format. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate, and more (100+ LLMs).
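The proxy maps OpenAI-style model names to backend providers through its config file. A minimal sketch of a LiteLLM `config.yaml` (the model names, Anthropic model ID, and environment-variable names below are placeholder assumptions, not part of this template):

```yaml
model_list:
  # Requests for "gpt-3.5-turbo" are forwarded to OpenAI directly
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY
  # The same OpenAI-format client call can be routed to another provider,
  # e.g. Anthropic, just by registering a different backend model
  - model_name: claude-3-haiku
    litellm_params:
      model: anthropic/claude-3-haiku-20240307
      api_key: os.environ/ANTHROPIC_API_KEY
```

With a config like this, switching providers from the client side is only a matter of changing the `model` string in the request.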

Test your deployed proxy:

import openai

client = openai.OpenAI(
    api_key="your-master-key",  # the master key configured on the proxy
    base_url="your-proxy-url"   # the public URL of your deployed proxy
)

# Request is routed to the model set on the LiteLLM proxy
# (`litellm --model <model>` or the proxy's config file)
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": "this is a test request, write a short poem"
        }
    ]
)

print(response)
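Because the proxy is OpenAI-compatible at the wire level, you can also call it with plain HTTP and no SDK. A minimal standard-library sketch (the URL and key are placeholders; `/chat/completions` is the proxy's OpenAI-format endpoint):

```python
import json
import urllib.request

def build_request(base_url: str, master_key: str, model: str, prompt: str):
    """Build an OpenAI-format chat completion request for the proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {master_key}",  # proxy master key
        },
        method="POST",
    )

if __name__ == "__main__":
    # Placeholders: substitute your deployment's URL and master key
    req = build_request("your-proxy-url", "your-master-key",
                        "gpt-3.5-turbo", "this is a test request, write a short poem")
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
        print(body["choices"][0]["message"]["content"])
```

This is the same request the `openai` client above sends under the hood, which is why any OpenAI-format tool can point at the proxy unchanged.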


More templates in this category:

- Chat Chat: your own unified chat and search to AI platform.
- openui: deploy OpenUI, AI-powered UI generation with GitHub OAuth and OpenAI API.
- firecrawl: firecrawl API server + worker without auth; works with Dify.