Deploy LobeChat | Multi-Model AI/LLM Support
Self-host LobeChat: private AI chat with 40+ providers, RAG, plugins, and more.

Deploy and Host LobeChat on Railway


Deploy LobeChat on Railway to run your own private AI chat platform with multi-model support, knowledge bases, and plugin extensibility. Self-host LobeChat with full control over your data and AI provider keys — no vendor lock-in.

This Railway template deploys a production-ready LobeChat stack with PostgreSQL (via ParadeDB with pgvector), Redis for caching and sessions, and RustFS for S3-compatible file storage. All services are pre-configured with secure inter-service networking and persistent volumes.

Getting Started with LobeChat on Railway

After Railway finishes deploying all five services, open your LobeChat URL to reach the sign-in page. LobeChat 2.0 uses Better Auth — register with an email and password to create your first account. Once logged in, navigate to Settings > Model Provider and add your API key for at least one provider (OpenAI, Anthropic Claude, Google Gemini, or any of the 40+ supported providers). Send your first message to verify the connection works.

To enable file uploads and the knowledge base, your RustFS storage is already configured. Upload a document in any chat to test RAG capabilities. Explore the Plugin Store and Agent Marketplace to extend functionality beyond basic chat.

About Hosting LobeChat

LobeChat is an open-source AI chat framework with 70,000+ GitHub stars, built with Next.js. It serves as a private alternative to ChatGPT with full multi-model support.

  • Multi-Provider AI: Connect 40+ providers — OpenAI, Claude, Gemini, DeepSeek, Ollama, AWS Bedrock, Azure, Mistral, Groq, and more
  • Knowledge Base (RAG): Upload documents, build searchable knowledge bases with pgvector-powered retrieval
  • Plugin Ecosystem: 10,000+ skills via Function Calling, MCP marketplace integration
  • Multi-Modal: Vision (image recognition), text-to-speech, speech-to-text, image generation
  • Better Auth: Built-in authentication with email/password, OAuth, SSO support
  • Artifacts & Thinking: Advanced reasoning visualization and interactive code artifacts

Why Deploy LobeChat on Railway

  • One-click deployment with PostgreSQL, Redis, and S3 storage pre-configured
  • Private networking between services — database never exposed to the internet
  • Persistent volumes for database and file storage survive redeploys
  • Scale services independently — add more RAM to LobeChat without touching the database
  • No Docker knowledge required — Railway handles containers, networking, and TLS

Common Use Cases for Self-Hosted LobeChat

  • Team AI Assistant: Deploy a shared ChatGPT alternative with centralized API key management and user accounts
  • RAG-Powered Knowledge Base: Upload internal docs and build a searchable AI assistant for your organization
  • Multi-Model Testing: Compare responses from GPT-4, Claude, Gemini, and open-source models side-by-side
  • Custom Agent Development: Build specialized AI agents with plugins, tools, and MCP integrations for domain-specific workflows

Dependencies for LobeChat on Railway

  • LobeChat (lobehub/lobehub) — Main application (Next.js 16, port 8080)
  • PostgreSQL (paradedb/paradedb:latest-pg17) — Database with pgvector extension for RAG
  • Redis (Railway managed) — Session storage, caching, rate limiting
  • RustFS (rustfs/rustfs:latest) — S3-compatible object storage for file uploads
  • RustFS Init (alpine:latest) — One-shot bucket initialization using minio/mc

Environment Variables Reference for LobeChat

Variable | Service | Description
DATABASE_URL | LobeChat | PostgreSQL connection string
REDIS_URL | LobeChat | Redis connection for sessions/cache
S3_ENDPOINT | LobeChat | RustFS public URL for file storage
S3_BUCKET | LobeChat | S3 bucket name (lobe)
KEY_VAULTS_SECRET | LobeChat | Encryption key for user API key vaults
AUTH_SECRET | LobeChat | Session token signing secret
APP_URL | LobeChat | Public-facing application URL
POSTGRES_DB | PostgreSQL | Database name
POSTGRES_PASSWORD | PostgreSQL | Database password
PGDATA | PostgreSQL | Data directory (subdirectory of volume)
RUSTFS_ACCESS_KEY | RustFS | S3 access key
RUSTFS_SECRET_KEY | RustFS | S3 secret key
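KEY_VAULTS_SECRET and AUTH_SECRET must be strong random values, and DATABASE_URL is assembled from the PostgreSQL credentials. A minimal sketch of generating them by hand (the hostnames and passwords below are placeholders, not the values Railway injects):

```shell
# Generate the two secrets LobeChat requires (32 random bytes, base64-encoded).
KEY_VAULTS_SECRET="$(openssl rand -base64 32)"   # encrypts stored provider API keys
AUTH_SECRET="$(openssl rand -base64 32)"         # signs session tokens

# Assemble connection strings. Host names and passwords are placeholders;
# on Railway these come from the private-network service references.
POSTGRES_PASSWORD="change-me"
DATABASE_URL="postgresql://postgres:${POSTGRES_PASSWORD}@postgres.railway.internal:5432/lobechat"
REDIS_URL="redis://default:change-me@redis.railway.internal:6379"

echo "$DATABASE_URL"
```

On Railway the template wires these automatically; the sketch is mainly useful if you deploy the same stack elsewhere.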


Hardware Requirements for Self-Hosting LobeChat

Resource | Minimum | Recommended
CPU | 1 vCPU | 2 vCPU
RAM | 1 GB (app) + 512 MB (PostgreSQL) | 2 GB (app) + 1 GB (PostgreSQL)
Storage | 2 GB (app) + 5 GB (database) | 10 GB (database) + 20 GB (files)
Runtime | Docker 20+ or Node.js 20+ | Docker Compose recommended

RAM scales with concurrent users and knowledge base size. PostgreSQL with pgvector benefits from additional memory for vector index caching.

Self-Hosting LobeChat with Docker Compose

The fastest way to self-host LobeChat is with Docker Compose. Clone the official repository and use the deploy configuration:

git clone https://github.com/lobehub/lobe-chat.git
cd lobe-chat/docker-compose/deploy
cp .env.example .env
# Set KEY_VAULTS_SECRET, AUTH_SECRET, and POSTGRES_PASSWORD in .env,
# or let setup.sh generate the secrets automatically
bash setup.sh
docker compose up -d

For a minimal single-service setup without persistent storage:

docker run -d \
  -p 3210:8080 \
  -e KEY_VAULTS_SECRET="$(openssl rand -base64 32)" \
  -e AUTH_SECRET="$(openssl rand -base64 32)" \
  -e DATABASE_URL="postgresql://user:pass@host:5432/lobechat" \
  lobehub/lobehub
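A quick way to confirm the container is serving is to poll the mapped port until it responds. The helper below is a generic retry sketch; the commented curl target assumes the `-p 3210:8080` mapping from the command above:

```shell
# wait_for: retry a command until it succeeds or attempts run out.
wait_for() {
  attempts=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    [ "$i" -ge "$attempts" ] && return 1
    sleep 2
  done
}

# Example: wait up to ~60s for LobeChat to answer on the mapped port.
# wait_for 30 curl -fsS http://localhost:3210 >/dev/null
```

If the check never succeeds, `docker logs` on the container usually shows a missing or malformed DATABASE_URL.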

How Much Does LobeChat Cost to Self-Host?

LobeChat is free and open-source under the Apache 2.0 license. There are no paid tiers for the self-hosted version — you get the full feature set. The only costs are infrastructure (server, database, storage) and AI provider API keys. On Railway, expect approximately $5–15/month for the full stack depending on usage. LobeHub also offers a managed cloud service with credit-based pricing for teams that prefer not to self-host.

LobeChat vs Open WebUI vs LibreChat

Feature | LobeChat | Open WebUI | LibreChat
License | Apache 2.0 | MIT | MIT
AI Providers | 40+ (API-based) | Ollama + OpenAI | 20+
Knowledge Base (RAG) | Built-in with pgvector | Built-in | Plugin-based
Plugin System | 10,000+ skills | Pipelines | Plugins
Auth | Better Auth (SSO, OAuth) | Built-in | OAuth
Local LLMs | Via Ollama integration | Native Ollama | Via Ollama
GitHub Stars | 70k+ | 80k+ | 20k+

LobeChat excels at multi-provider API-based workflows with the richest plugin ecosystem. Open WebUI is the stronger choice for local LLM-first deployments with Ollama.

FAQ

What is LobeChat and why self-host it? LobeChat is an open-source AI chat framework that lets you build a private ChatGPT-like interface. Self-hosting gives you full control over data privacy, API key management, and the ability to connect any AI provider without third-party intermediaries.

What does this Railway template deploy for LobeChat? This template deploys five services: the LobeChat application (lobehub/lobehub), PostgreSQL with pgvector for database and vector search, Railway-managed Redis for caching, RustFS for S3-compatible file storage, and a one-shot init service that creates the storage bucket.

Why does LobeChat need PostgreSQL with pgvector on Railway? PostgreSQL stores user accounts, chat history, settings, and plugin configurations. The pgvector extension enables the Knowledge Base feature — when you upload documents, LobeChat creates vector embeddings and stores them in PostgreSQL for semantic search (RAG).

How do I add my OpenAI or Claude API key to self-hosted LobeChat? After logging in, go to Settings > Model Provider and enter your API key for each provider. LobeChat encrypts stored keys using the KEY_VAULTS_SECRET environment variable. You can add keys for OpenAI, Anthropic Claude, Google Gemini, and 40+ other providers simultaneously.

Can I use LobeChat with local LLMs like Ollama on Railway? LobeChat supports Ollama as a model provider, but running Ollama requires GPU access which Railway does not currently provide. You can connect LobeChat on Railway to an Ollama instance running on a separate GPU server by setting the Ollama API endpoint in the model provider settings.
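For example, if your GPU server exposes Ollama on its default port, the connection can also be set server-side through an environment variable (the host below is a placeholder, and the variable name `OLLAMA_PROXY_URL` should be verified against the current LobeChat documentation):

```shell
# Point LobeChat at a remote Ollama instance (placeholder host; verify the
# variable name against the current LobeChat docs for your release).
export OLLAMA_PROXY_URL="http://gpu-server.example.com:11434"
echo "$OLLAMA_PROXY_URL"
```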

How do I enable file uploads and knowledge base in self-hosted LobeChat? This template pre-configures RustFS as S3-compatible storage. File uploads and knowledge base features work out of the box — upload documents in any chat conversation or create dedicated knowledge bases from the sidebar.

