Deploy LibreChat
Open-source ChatGPT clone: multi-model, agents, search, multi-user auth
- VectorDB 🗃️: pgvector/pgvector:pg16 (volume: /var/lib/postgresql/data)
- RAG API 📚: danny-avila/librechat-rag-api-dev-lite:latest
- LibreChat 🪶: danny-avila/librechat-dev:latest
- Meilisearch 🔍: getmeili/meilisearch:v1.11.3 (volume: /meili_data)
- MongoDB 🍃: mongo (volume: /data/db)
LibreChat on Railway
⚠️ BEFORE DEPLOYMENT
Make sure to review and correctly configure your .env file using the official guide:
👉 LibreChat Configuration Guide
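As a rough starting point, a minimal .env for this template might look like the sketch below. The service hostnames and all values are placeholders (internal hostnames on Railway depend on how the services are named); the authoritative variable list is in the configuration guide above.

```shell
# Placeholder values only; see the LibreChat configuration guide for the full list.
MONGO_URI=mongodb://mongodb:27017/LibreChat   # bundled MongoDB service (hostname is an assumption)
MEILI_HOST=http://meilisearch:7700            # bundled Meilisearch service (hostname is an assumption)
MEILI_MASTER_KEY=replace-with-a-random-key
RAG_API_URL=http://rag-api:8000               # only if using the RAG API + VectorDB services
OPENAI_API_KEY=sk-your-key                    # add keys only for the providers you use
JWT_SECRET=replace-with-a-random-secret
JWT_REFRESH_SECRET=replace-with-a-random-secret
ALLOW_REGISTRATION=true
```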
💡 What is LibreChat?
LibreChat is an open-source, ChatGPT-style AI chat platform built for flexibility, privacy, and full customization. It supports multiple AI providers (OpenAI, Anthropic, Azure, Google, Ollama, Bedrock, etc.) and includes advanced tools such as a code interpreter, file handling, agent builder, and multimodal AI interactions—all in a modern web UI.
☁️ About Hosting LibreChat
Hosting LibreChat on Railway provides a seamless cloud environment for full-stack AI applications. Deployment takes minutes and requires minimal setup. Railway automatically provisions your backend, database, and static frontend while allowing scalable infrastructure with no manual configuration.
You can connect your environment variables from LibreChat’s .env configuration and instantly deploy a production-ready AI platform, complete with authentication, model selection, agents, and more.
🔧 Common Use Cases
- Build a private ChatGPT alternative with multiple AI backends.
- Host a multi-user AI workspace with authentication and moderation tools.
- Deploy a custom AI assistant or agent platform with no-code tools.
- Integrate local or remote AI models (Ollama, vLLM, Groq, Bedrock).
- Run secure AI file processing and code execution environments.
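For the local-model use case, custom endpoints are declared in librechat.yaml. The fragment below is an illustrative sketch for an OpenAI-compatible server such as Ollama; the baseURL, model name, and version number are assumptions to adapt to your setup.

```yaml
# Illustrative librechat.yaml fragment: register an OpenAI-compatible endpoint.
# baseURL and model names are placeholders; adjust to your own server.
version: 1.1.4
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"                      # many local servers accept any non-empty key
      baseURL: "http://localhost:11434/v1"  # assumed local Ollama address
      models:
        default: ["llama3"]
        fetch: true                         # query the server for its model list
```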
📦 Dependencies for LibreChat Hosting
- Node.js 18+ (recommended 20.x)
- MongoDB (use Railway’s built-in database or connect an external instance)
- Redis (optional but recommended) for caching and session handling
Deployment Dependencies
- Environment configuration: .env setup guide
- AI endpoints setup: librechat.yaml reference
- Custom endpoints: OpenAI-compatible APIs
⚙️ Implementation Details
LibreChat’s Railway template automatically deploys the backend (Express + MongoDB) and serves the production frontend built with React + Vite. After deployment:
- Go to Railway’s dashboard → Variables and add the .env keys from the config guide.
- Redeploy the service.
- Access your hosted LibreChat instance from the Railway-generated domain.
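Several of the .env keys you add as Variables are random secrets (JWT_SECRET, JWT_REFRESH_SECRET, CREDS_KEY, CREDS_IV). One way to generate values of the lengths the config guide expects, assuming openssl is available locally:

```shell
# Generate random hex secrets for LibreChat's .env.
# Lengths per the config guide: CREDS_KEY is a 32-byte key, CREDS_IV a 16-byte IV.
openssl rand -hex 32   # CREDS_KEY
openssl rand -hex 16   # CREDS_IV
openssl rand -hex 32   # JWT_SECRET
openssl rand -hex 32   # JWT_REFRESH_SECRET
```

Paste each generated value into the matching Railway variable, then redeploy.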
Optional integrations include:
- File uploads (S3-compatible storage)
- OpenAI, Azure, or Anthropic API keys
- OAuth2 / Email login
🚀 Why Deploy LibreChat on Railway?
Railway is a unified deployment platform that eliminates infrastructure complexity. It automatically handles build, configuration, and scaling for your stack, allowing you to focus on development.
By deploying LibreChat on Railway, you instantly get:
- Automated deployment with a single click
- Built-in database hosting
- Scalable, containerized services
- Simplified .env management
- CI/CD integration and logs out of the box
You’re one click away from running a complete, production-ready AI platform with support for multiple models, agents, and tools—all in one place.
🌐 Project Links
- 🔗 Project Repository: github.com/danny-avila/LibreChat
- 📘 Documentation: www.librechat.ai/docs
- 💬 Community Discord: discord.librechat.ai
Template Content
- VectorDB 🗃️: pgvector/pgvector:pg16
  - OPENAI_API_KEY: set a random value if you want to set up RAG later; otherwise check out https://www.librechat.ai/docs/configuration/rag_api
- LibreChat 🪶: ghcr.io/danny-avila/librechat-dev:latest
- Meilisearch 🔍: getmeili/meilisearch:v1.11.3
- MongoDB 🍃: mongo