Deploy Clawdbot
Self-hosted AI agent gateway with web UI. Any LLM provider.
Deploy and Host Clawdbot on Railway
Clawdbot is a self-hosted AI agent gateway that connects any LLM provider — Anthropic, OpenAI, MiniMax, and more — to a unified web dashboard. It includes chat, agent management, persistent memory, a file workspace, and support for messaging channels like Telegram, Discord, WhatsApp, and Slack.
About Hosting Clawdbot
Clawdbot runs as a Node.js gateway inside a Docker container. This template deploys it with a persistent volume so your config, agent memories, and workspace files survive redeployments. A minimal gateway config is baked into the Docker image and automatically seeded to the volume on first boot. After deploy, you configure your API keys and model provider entirely through the web-based Control UI — no SSH, no manual JSON editing, no environment variable juggling. The gateway supports hot-reloading, so config changes apply without downtime.
Setup After Deploy
- Open the Control UI — go to your Railway-generated domain URL
- Connect with your token — on the Gateway Access page, enter the CLAWDBOT_GATEWAY_TOKEN from your Railway Variables tab and click Connect
- Set your API key — in the Control UI, go to Config and add your Anthropic API key
- Set your workspace — in Config, set the agent workspace to /data/clawd so files persist on the volume
- Start chatting — go to the Chat tab and send a message
All configuration is done through the Control UI. The default model is Claude Opus 4.5 — no extra setup needed.
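If you want to confirm the gateway is answering HTTP requests before opening the Control UI, a quick probe like the sketch below can help. It is not part of the template: the URL is a placeholder for your Railway-generated domain, the GATEWAY_URL variable name is invented for this example, and no specific Clawdbot endpoint is assumed, only that the root path responds once the service is up.

```ts
// check-gateway.ts - minimal reachability probe (illustrative only).
// GATEWAY_URL is a placeholder for your Railway-generated domain.
const GATEWAY_URL = process.env.GATEWAY_URL ?? "https://your-app.up.railway.app";

async function main(): Promise<void> {
  try {
    // Any HTTP response means the container is running and routable.
    const res = await fetch(GATEWAY_URL);
    console.log(`Gateway responded with HTTP ${res.status}; the Control UI should be reachable.`);
  } catch (err) {
    console.error("Gateway is not reachable yet:", err);
    process.exitCode = 1;
  }
}

main();
```

Run it with Node 18+ (which ships a global fetch) once the Railway deploy shows as healthy.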
Common Use Cases
- Run a personal AI assistant accessible from any browser or messaging app
- Host a multi-model gateway that routes between Claude, GPT, MiniMax, or any LLM provider
- Deploy a persistent AI agent with its own workspace, memory, and file storage
- Connect AI to Telegram, Discord, WhatsApp, or Slack from a single gateway
Dependencies for Clawdbot Hosting
- An API key from at least one LLM provider (e.g. Anthropic, OpenAI, MiniMax)
- A Railway account with a volume for persistent storage
Deployment Dependencies
- namanxajmera/clawdbot-railway (the template's source repository)
Implementation Details
The Dockerfile uses an inline ENTRYPOINT to seed config on first boot:
- A default gateway config is baked into the image at build time
- On container start, the ENTRYPOINT checks if a config file exists on the volume
- If not, it copies the baked default to the volume, then starts the gateway
- Subsequent deploys reuse the existing volume config, preserving all your settings
The CLAWDBOT_STATE_DIR environment variable points the gateway to the volume-backed directory. The CLAWDBOT_GATEWAY_TOKEN secures access to the Control UI.
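The actual seeding logic lives in the Dockerfile's ENTRYPOINT; the TypeScript sketch below only illustrates the same first-boot behavior in Node terms. The file names (default.config.json, clawdbot.json) and the gateway start command are assumptions made for this example, not the template's real paths.

```ts
// entrypoint.ts - illustrative first-boot config seeding.
// Paths and the start command are assumptions for this sketch; the template
// implements the equivalent logic in its Dockerfile ENTRYPOINT.
import { copyFileSync, existsSync, mkdirSync } from "node:fs";
import { join } from "node:path";
import { spawn } from "node:child_process";

// CLAWDBOT_STATE_DIR points the gateway at the volume-backed directory (e.g. /data).
const stateDir = process.env.CLAWDBOT_STATE_DIR ?? "/data";
const seededConfig = join(stateDir, "clawdbot.json");  // assumed config file name
const bakedDefault = "/app/default.config.json";       // assumed location of the baked-in default

// Seed the default config only if the volume does not already hold one,
// so later deploys keep whatever you changed through the Control UI.
if (!existsSync(seededConfig)) {
  mkdirSync(stateDir, { recursive: true });
  copyFileSync(bakedDefault, seededConfig);
  console.log(`Seeded default config to ${seededConfig}`);
}

// Hand off to the gateway process. CLAWDBOT_GATEWAY_TOKEN is read from the
// environment by the gateway itself and gates access to the Control UI.
const gateway = spawn("node", ["gateway.js"], { stdio: "inherit" }); // assumed start command
gateway.on("exit", (code) => process.exit(code ?? 1));
```

Because the check is "copy only if missing," the step is idempotent: redeploys reuse the volume's existing config and your Control UI changes survive.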
Why Deploy Clawdbot on Railway?
Railway is a singular platform to deploy your infrastructure stack. Railway will host your infrastructure so you don't have to deal with configuration, while allowing you to vertically and horizontally scale it.
By deploying Clawdbot on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.
Template Content
clawdbot-railway (namanxajmera/clawdbot-railway)
- CLAWDBOT_GATEWAY_TOKEN: secures access to the Control UI
- Port the gateway listens on.

