
Deploy OpenHuman
OpenHuman core with persistent memory and secure RPC.
Deploy and Host OpenHuman on Railway
This Railway template deploys OpenHuman Core as a hosted, headless JSON-RPC service with a persistent Railway volume. After deployment, connect the OpenHuman desktop app to your Railway core using the generated RPC URL and token. This template does not deploy the full TinyHumans backend or a standalone web UI.
How to Use This Template
Deploy the template on Railway and wait for the OpenHuman Core service to become healthy. Copy the generated OPENHUMAN_CORE_RPC_URL and OPENHUMAN_CORE_TOKEN from the Railway service variables, then configure your OpenHuman desktop client to use that remote core. The expected RPC endpoint format is:
https://your-template-url/rpc
Before using the client, test the deployment with:
curl https://your-template-url/health
If the service is healthy but agent actions fail, check the Railway logs for workspace or permission errors. This template uses RAILWAY_RUN_UID=0 so the container can write to the mounted persistent volume.
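Beyond the health check, you can confirm that the bearer token reaches the RPC endpoint correctly. The sketch below builds the curl invocation and prints it for inspection rather than executing it; the URL and token are placeholders, and the JSON-RPC method name "ping" is an assumption, not a documented OpenHuman method.

```shell
#!/bin/sh
# Placeholder values -- substitute your Railway-generated URL and token.
RPC_URL="https://your-template-url/rpc"
TOKEN="replace-with-OPENHUMAN_CORE_TOKEN"

# rpc_call prints (rather than runs) an authenticated JSON-RPC 2.0 request,
# so you can inspect the command before pointing it at a live deployment.
rpc_call() {
  method="$1"
  body="{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"$method\"}"
  printf 'curl -s -X POST %s -H "Authorization: Bearer %s" -H "Content-Type: application/json" -d '\''%s'\''\n' \
    "$RPC_URL" "$TOKEN" "$body"
}

# The method name "ping" is hypothetical; use whatever methods your core exposes.
rpc_call ping
```

Copy the printed command, substitute your real values, and run it against the deployed core. A 401 response usually means the token does not match OPENHUMAN_CORE_TOKEN on the service.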
What is OpenHuman?
OpenHuman is a personal AI assistant platform built around a local-first desktop experience and a Rust-based core. The core handles agent runtime, memory, skills, persistence, and JSON-RPC communication. This template hosts the core remotely while keeping its workspace persisted on Railway storage.
About Hosting OpenHuman
Hosting OpenHuman on Railway means running the openhuman-core service as a remote headless backend for the OpenHuman desktop app. Railway exposes the core over HTTPS, stores its workspace on a persistent volume, and protects the JSON-RPC API with a bearer token. The deployment is useful when you want your OpenHuman core to stay online beyond your local machine. Some OpenHuman features may still depend on the official TinyHumans backend for authentication, OAuth, or cloud services.
Common Use Cases
- Run OpenHuman Core as an always-on remote service for the desktop client.
- Persist OpenHuman memory, config, skills, and local state across redeploys.
- Test OpenHuman’s self-hosted core deployment without managing a VPS manually.
Dependencies for OpenHuman Hosting
- A Railway service running the OpenHuman Core container image.
- A Railway volume mounted at /home/openhuman/.openhuman for persistent storage.
- A secure OPENHUMAN_CORE_TOKEN used by clients to authenticate JSON-RPC calls.
- The OpenHuman desktop app configured to use the Railway RPC endpoint.
- The official TinyHumans backend URL for OpenHuman cloud, auth, and OAuth flows.
Deployment Dependencies
- OpenHuman GitHub repository: https://github.com/tinyhumansai/openhuman
- OpenHuman documentation: https://tinyhumans.gitbook.io/openhuman
Implementation Details
Recommended Railway variables:
BACKEND_URL="https://api.tinyhumans.ai" # official TinyHumans backend used by OpenHuman for auth, OAuth, and cloud services
RAILWAY_RUN_UID="0" # runs the container as root to avoid Permission denied errors on the Railway volume
OPENHUMAN_WORKSPACE="/home/openhuman/.openhuman" # persistent core workspace: config, SQLite databases, memory, skills, and local state
OPENHUMAN_CORE_TOKEN="${{ secret(32) }}" # Railway-generated secret used as the Bearer token to protect the core JSON-RPC API
OPENHUMAN_CORE_RPC_URL="https://${{RAILWAY_PUBLIC_DOMAIN}}/rpc" # public JSON-RPC endpoint URL for clients connecting to this Railway core
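Before connecting a client, it can help to sanity-check that these variables are present and that the RPC URL matches the expected endpoint format. This is a sketch, not part of the template; the default values below are illustrative stand-ins for the Railway-generated ones.

```shell
#!/bin/sh
# Illustrative defaults -- on Railway these are set as service variables.
: "${OPENHUMAN_WORKSPACE:=/home/openhuman/.openhuman}"
: "${OPENHUMAN_CORE_TOKEN:=example-token-change-me}"
: "${OPENHUMAN_CORE_RPC_URL:=https://your-template-url/rpc}"

# Fail fast if any recommended variable is missing or empty.
for var in OPENHUMAN_WORKSPACE OPENHUMAN_CORE_TOKEN OPENHUMAN_CORE_RPC_URL; do
  eval "val=\$$var"
  if [ -z "$val" ]; then
    echo "missing: $var" >&2
    exit 1
  fi
  echo "$var is set"
done

# The RPC URL should end in /rpc, matching the endpoint format shown earlier.
case "$OPENHUMAN_CORE_RPC_URL" in
  */rpc) echo "RPC URL format looks correct" ;;
  *) echo "warning: RPC URL does not end in /rpc" >&2 ;;
esac
```

You could run this locally with the service variables injected (for example via the Railway CLI) to catch a misconfigured endpoint before debugging the desktop client.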
Recommended volume mount path:
/home/openhuman/.openhuman
Why Deploy OpenHuman on Railway?
Railway is a single platform for deploying your entire infrastructure stack. Railway hosts your infrastructure so you don't have to deal with configuration, while letting you scale it vertically and horizontally.
By deploying OpenHuman on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.
Template Content
openhuman-core
ghcr.io/tinyhumansai/openhuman-core:latest