Deploy Everruns

Deploy the Everruns OSS agent platform on Railway.

Deploy and Host Everruns

Everruns is an OSS durable agent platform for running AI agents with a control plane, worker pool, browser UI, and persistent storage.

About Hosting Everruns

This template deploys a complete test stack on Railway. Caddy is the only public service and routes browser traffic, API traffic, OAuth/MCP endpoints, and health checks to the correct internal service. The server, worker, UI, Postgres, Redis, and NATS services stay on Railway private networking.
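The routing described above could be sketched as a Caddyfile like the one below. Every path and upstream hostname here is an illustrative assumption (the template's actual routes and service names may differ); Railway private networking exposes services on `*.railway.internal` hostnames.

```
# Hypothetical Caddyfile sketch of the single public ingress.
# All paths, ports, and internal hostnames are illustrative assumptions.
:{$PORT} {
	# API, OAuth, and MCP traffic goes to the control-plane server
	handle /api/* {
		reverse_proxy server.railway.internal:8080
	}
	handle /oauth/* {
		reverse_proxy server.railway.internal:8080
	}
	handle /mcp/* {
		reverse_proxy server.railway.internal:8080
	}
	# Health checks answered by the server
	handle /healthz {
		reverse_proxy server.railway.internal:8080
	}
	# Everything else is browser traffic for the UI
	handle {
		reverse_proxy ui.railway.internal:3000
	}
}
```

The key property is that only Caddy is publicly reachable; every `reverse_proxy` target stays on the private network.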

Why Deploy Everruns

Use this template when you want a fast test environment for Everruns without hand-translating the Docker Compose topology into Railway services.

Common Use Cases

  • Test Everruns from a public HTTPS URL
  • Run durable agent sessions with a worker process
  • Exercise API, UI, MCP, and OAuth routes behind one ingress
  • Validate Railway deployment behavior before building a production setup

Dependencies for Everruns

This template provisions the required backing services and runtime processes.

Deployment Dependencies

  • PostgreSQL for durable storage
  • Redis for distributed rate limiting
  • NATS with JetStream for event delivery and worker notifications
  • Everruns server/control plane
  • Everruns worker
  • Everruns UI
  • Caddy public ingress

Required variables during deployment:

  • AUTH_ADMIN_EMAIL: email for the single admin account
  • AUTH_ADMIN_PASSWORD: password for the admin account
  • AUTH_JWT_SECRET: random signing secret, 32+ characters
  • WORKER_GRPC_AUTH_TOKEN: shared random token for server-to-worker gRPC auth
  • SECRETS_ENCRYPTION_KEY: the literal prefix kek-v1: followed by 32 random bytes encoded as standard base64

Generate the encryption key locally with:

python3 -c "import os, base64; print('kek-v1:' + base64.b64encode(os.urandom(32)).decode())"
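The template only documents a generator for the encryption key. A sketch that extends the same approach to the other two random secrets, using only the Python standard library (variable names match the list above; the 32-character lengths are the stated minimums):

```python
# Generate all three random secrets locally.
# Lengths are illustrative minimums; longer values are fine.
import os
import base64
import secrets

# AUTH_JWT_SECRET: random signing secret, 32+ characters
jwt_secret = secrets.token_urlsafe(32)

# WORKER_GRPC_AUTH_TOKEN: shared random token for server-to-worker gRPC auth
grpc_token = secrets.token_urlsafe(32)

# SECRETS_ENCRYPTION_KEY: "kek-v1:" plus 32 random bytes as standard base64
encryption_key = "kek-v1:" + base64.b64encode(os.urandom(32)).decode()

print("AUTH_JWT_SECRET=" + jwt_secret)
print("WORKER_GRPC_AUTH_TOKEN=" + grpc_token)
print("SECRETS_ENCRYPTION_KEY=" + encryption_key)
```

Paste the printed values into the matching Railway variables when the template prompts for them.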

After deployment, open the generated Caddy domain and sign in with the admin credentials. Add LLM provider keys in Settings > Providers.
