Deploy openclaw with native onboard

openclaw for Railway with a minimal reverse proxy wrapper.

Deploy and Host openclaw with native onboard on Railway

OpenClaw is a personal AI assistant with a web-based Control UI and gateway. The "native onboard" variant deploys it on Railway inside a Docker container with a lightweight Express reverse proxy, and lets you initialize the gateway yourself via Railway SSH — giving you full control over configuration.

About Hosting openclaw with native onboard

Hosting OpenClaw with native onboard on Railway involves deploying a Docker container that wraps the OpenClaw gateway with a minimal Express reverse proxy. All routes are protected with HTTP Basic Auth via a PASSWORD environment variable. After the first deploy, you SSH into the container to run openclaw onboard once — configuring the workspace, gateway port, auth token, and allowed origins. On every subsequent restart, the wrapper automatically detects the existing config and starts the gateway, proxying all traffic (including WebSockets) to it. A Railway Volume at /data provides persistent storage for config, credentials, and AI memory across redeploys.

Common Use Cases

  • Running a personal, always-on AI assistant accessible from anywhere through a secure authenticated web gateway
  • Self-hosting an AI coding assistant with full control over your data and configuration
  • Deploying a private OpenClaw instance for research, automation, or team use without relying on third-party cloud AI services

Dependencies for openclaw with native onboard Hosting

  • Railway Volume — must be mounted at /data to persist OpenClaw state, credentials, and workspace across redeploys
  • OpenClaw — built from source (GitHub) during the Docker image build stage using pnpm and bun

Deployment Dependencies

  • OpenClaw — the AI assistant built and embedded in the Docker image
  • Railway Volumes — required for persistent state storage
  • Railway SSH — used for the one-time openclaw onboard initialization step

Implementation Details

After the first deploy, SSH into the container and run the one-time onboard command:

railway ssh --project=<project-id> --service=<service-id> --environment=<environment-id>

openclaw onboard \
  --workspace /data/workspace \
  --gateway-bind loopback \
  --gateway-port 18789 \
  --gateway-auth token \
  --gateway-token "$OPENCLAW_GATEWAY_TOKEN" \
  --no-install-daemon

openclaw config set --json gateway.controlUi.allowedOrigins '["https://<your-app>.up.railway.app"]'
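Once onboarding is done and the service restarts, you can sanity-check the Basic Auth layer from any machine. A hypothetical Node smoke test (the deployment URL and username below are placeholders, not values from this template):

```javascript
// Build the Basic Auth header the wrapper expects. The username is arbitrary
// here — an assumption that only the password (PASSWORD env var) is checked.
function basicAuthHeader(user, password) {
  return "Basic " + Buffer.from(`${user}:${password}`).toString("base64");
}

// Hypothetical usage (substitute your own deployment URL and password):
// const res = await fetch("https://<your-app>.up.railway.app/", {
//   headers: { Authorization: basicAuthHeader("admin", process.env.PASSWORD) },
// });
// A matching password should return the Control UI; a mismatch should be rejected.

module.exports = { basicAuthHeader };
```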

Why Deploy openclaw with native onboard on Railway?
Railway is a singular platform to deploy your infrastructure stack. Railway will host your infrastructure so you don't have to deal with configuration, while allowing you to vertically and horizontally scale it.

By deploying openclaw with native onboard on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.

Template Content

openclaw-railway-template

LaceLetho/openclaw-railway-template
