LiteLLM Proxy Server

A self-contained template for LiteLLM Proxy Server with DB/Cache

Deploy LiteLLM Proxy Server

litellm

Image: berriai/litellm:main-stable

Postgres

Image: railwayapp-templates/postgres-ssl:16
Volume: /var/lib/postgresql/data

Redis

Image: bitnami/redis:7.2.5
Volume: /bitnami
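Outside Railway, the same three services can be approximated with a compose file. Only the image tags and volume paths above come from the template; the port, passwords, and service wiring below are assumptions for a local sketch, not a production setup:

```yaml
services:
  litellm:
    image: berriai/litellm:main-stable
    ports:
      - "4000:4000"            # LiteLLM proxy's default port
    depends_on: [postgres, redis]
  postgres:
    image: railwayapp-templates/postgres-ssl:16
    environment:
      POSTGRES_PASSWORD: change-me   # placeholder, set your own
    volumes:
      - pgdata:/var/lib/postgresql/data
  redis:
    image: bitnami/redis:7.2.5
    environment:
      ALLOW_EMPTY_PASSWORD: "yes"    # bitnami image refuses to start without a password setting; dev only
    volumes:
      - redisdata:/bitnami
volumes:
  pgdata:
  redisdata:
```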

LiteLLM Proxy Server Deployment Guide

This guide walks through deploying a LiteLLM Proxy Server instance with Postgres database access, a Redis cache, and the admin UI panel pre-configured.

Configuration Management

LiteLLM loads its configuration from a config.yaml file. Because Railway doesn't provide file management for image-based deployments of the LiteLLM image, we'll host the configuration file in AWS S3 and have the proxy fetch it at startup.

For complete configuration options, refer to the official LiteLLM documentation.

Sample Configuration File

Here's a sample config.yaml that configures an OpenRouter model with Redis caching and Langfuse callbacks. It's the same setup I used in my own basic deployment:

model_list:
  - model_name: gpt-4o-openrouter
    litellm_params:
      model: openrouter/openai/gpt-4o-2024-11-20
      api_base: https://openrouter.ai/api/v1
      api_key: your-openrouter-key
      input_cost_per_token: 0.0000025
      output_cost_per_token: 0.000010
      rpm: 300

litellm_settings:
  success_callback: ["langfuse"]
  cache: True
  cache_params:
    type: "redis"
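The cost fields above are per token. As a quick sketch, here is the request body a client would send for this model through the proxy's OpenAI-compatible /chat/completions route, plus the arithmetic for what a 1,000-input / 500-output token call would cost at the configured rates (the endpoint shape is the standard OpenAI one; host and key are up to your deployment):

```python
import json

# Request body as a client would POST it to the proxy's
# OpenAI-compatible /chat/completions endpoint.
payload = {
    "model": "gpt-4o-openrouter",  # model_name from config.yaml above
    "messages": [{"role": "user", "content": "Hello"}],
}
body = json.dumps(payload)

# Per-token rates copied from the config above.
INPUT_COST = 0.0000025   # USD per input token
OUTPUT_COST = 0.000010   # USD per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated spend for one call under the configured rates."""
    return input_tokens * INPUT_COST + output_tokens * OUTPUT_COST

print(request_cost(1000, 500))  # about 0.0075 USD
```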

S3 Configuration Setup

  1. Create S3 Bucket

    • Navigate to AWS S3 Console
    • Create new bucket with these settings:
      • Name: your-litellm-configs
      • Region: your-region
      • Object Ownership: ACLs disabled
      • Block Public Access: Enable all
      • Bucket Versioning: Enable
      • Default encryption: SSE-S3
  2. Create IAM User and Policy

    • Go to AWS IAM Console
    • Create new user named litellm-config-user
    • Create this policy named litellm-config-access:
      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Effect": "Allow",
                  "Action": [
                      "s3:GetObject",
                      "s3:ListBucket",
                      "s3:PutObject"
                  ],
                  "Resource": [
                      "arn:aws:s3:::your-litellm-configs",
                      "arn:aws:s3:::your-litellm-configs/*"
                  ]
              }
          ]
      }
      
    • Attach policy to user
    • Create access keys for "Application running outside AWS"
    • Save both the Access Key ID and Secret Access Key
  3. Upload Configuration

    • Create your config.yaml file locally
    • Upload to your S3 bucket
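Before uploading, it's worth sanity-checking that the file parses and has the keys LiteLLM expects. A small sketch using PyYAML; the required-key list is my own minimal assumption, not LiteLLM's full schema:

```python
import yaml  # PyYAML

# Minimal keys every model_list entry should carry (assumption,
# based on the sample config above, not an exhaustive schema).
REQUIRED_MODEL_KEYS = {"model_name", "litellm_params"}

def validate_config(text: str) -> list:
    """Parse config.yaml text and return the model names found,
    raising ValueError if the structure is obviously wrong."""
    cfg = yaml.safe_load(text)
    models = cfg.get("model_list") or []
    if not models:
        raise ValueError("config has no model_list entries")
    for entry in models:
        missing = REQUIRED_MODEL_KEYS - entry.keys()
        if missing:
            raise ValueError(f"model entry missing keys: {missing}")
    return [m["model_name"] for m in models]

sample = """
model_list:
  - model_name: gpt-4o-openrouter
    litellm_params:
      model: openrouter/openai/gpt-4o-2024-11-20
"""
print(validate_config(sample))  # ['gpt-4o-openrouter']
```

Once it validates, upload via the S3 console or with `aws s3 cp config.yaml s3://your-litellm-configs/`.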

Railway Environment Setup

Configure these environment variables in your Railway project:

  • LITELLM_CONFIG_BUCKET_NAME=your-litellm-configs
  • LITELLM_CONFIG_BUCKET_OBJECT_KEY=config.yaml
  • AWS_ACCESS_KEY_ID=your_access_key_id
  • AWS_SECRET_ACCESS_KEY=your_secret_access_key
  • AWS_REGION_NAME=your_region
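These five variables are everything the proxy needs to locate and fetch the config from S3. A quick fail-fast check before deploying; the variable names come from the list above, the helper itself is illustrative:

```python
import os

# Environment variables the Railway service must define
# (names taken from the list above).
REQUIRED_VARS = [
    "LITELLM_CONFIG_BUCKET_NAME",
    "LITELLM_CONFIG_BUCKET_OBJECT_KEY",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "AWS_REGION_NAME",
]

def missing_vars(env=os.environ) -> list:
    """Return the required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: everything set except the secret key.
fake_env = {name: "x" for name in REQUIRED_VARS}
del fake_env["AWS_SECRET_ACCESS_KEY"]
print(missing_vars(fake_env))  # ['AWS_SECRET_ACCESS_KEY']
```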

For additional configuration options and advanced settings, please refer to the LiteLLM Proxy documentation.


Template by Will Bogusz. Created on Jan 19, 2025.