Array Ventures: Coworker Agent
Deploy and Host Coworker on Railway

Coworker is an open-source AI agent built with Mastra. It acts as an AI team member that handles tasks, answers questions, and manages workflows via chat. It supports OpenAI, Anthropic, Google Gemini, NVIDIA, Groq, and Kimi, and includes the A2A protocol, an MCP server/client with UI, a Lovable-style app builder, and a skills marketplace. It is an open-source alternative to OpenClaw.

About Hosting Coworker

This template deploys a complete Coworker stack: the Mastra server, Inngest for scheduled task workflows, Postgres and Redis for Inngest's production mode, and an optional Tailscale subnet router for private network access. All services are pre-configured with Railway's private networking so they communicate securely without public exposure. After deploying, set your preferred AI model and provider API key. Download the Coworker desktop app for macOS and point it at your Railway server URL.
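The "set your preferred AI model and provider API key" step comes down to a few environment variables on the Coworker service. A rough sketch of what that configuration might look like; the variable names below are hypothetical illustrations, not the template's actual keys, so check the service settings in the Railway dashboard for the exact names it defines:

```shell
# Hypothetical variable names for illustration only — the template's service
# settings define the real keys.
MODEL_PROVIDER=anthropic            # or openai, google, nvidia, groq, kimi
ANTHROPIC_API_KEY=your-key-here     # API key for the provider chosen above
TAILSCALE_AUTHKEY=tskey-auth-...    # optional: only if using the subnet router
```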

Common Use Cases

  • AI chat assistant — answer questions, draft content, summarize documents, and brainstorm ideas
  • Scheduled task automation — run recurring AI workflows on a cron schedule via Inngest
  • App builder — Lovable-like builder for creating internal dashboards and tools, maintained by agents with git version control
  • MCP UI — visual interface for managing MCP servers and building agent-maintained internal dashboards
  • Skills marketplace — install community-built skills from ClawHub and skills.sh
  • MCP registry — discover and install MCP servers from the built-in registry
  • A2A (Agent-to-Agent) protocol — let other AI agents discover and communicate with Coworker
  • Google Workspace integration — manage emails, calendar, and docs through natural language
  • WhatsApp bridge — interact with your AI agent via WhatsApp messages
  • Multi-provider AI — switch between OpenAI, Anthropic, Google, NVIDIA, Groq, Kimi, or any OpenAI-compatible endpoint
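The last bullet is the load-bearing one for self-hosters: "any OpenAI-compatible endpoint" means the same chat-completions request shape works against different providers just by swapping the base URL. A minimal sketch of that idea; the helper, model names, and prompt are illustrative assumptions, not part of the template (the base URLs are the providers' documented OpenAI-compatible endpoints):

```python
# Sketch (assumptions labeled): one request builder serves many providers
# because they share the OpenAI chat-completions wire format; only the
# base URL changes.
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build (but do not send) an OpenAI-style chat-completions request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


# Same shape, different providers (model names are illustrative):
groq_req = build_chat_request(
    "https://api.groq.com/openai/v1", "KEY", "llama-3.1-8b-instant", "hi")
openai_req = build_chat_request(
    "https://api.openai.com/v1", "KEY", "gpt-4o-mini", "hi")
```

Sending either request with `urllib.request.urlopen` (or any HTTP client) would hit the respective provider; only credentials and the base URL differ.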

Dependencies for Coworker Hosting

  • An API key for your chosen AI provider (OpenAI, Anthropic, NVIDIA, Google, Groq, or Kimi)
  • A Tailscale account and auth key (optional, for private network access)


Implementation Details

Coworker runs as a Docker container from ghcr.io/array-ventures/coworker:latest. The Mastra server exposes a REST API with built-in routes for A2A protocol (/api/a2a/coworker), agent card discovery (/api/.well-known/coworker/agent-card.json), and MCP server endpoints. Inngest connects to the Coworker SDK endpoint via Railway's private network for workflow orchestration. All inter-service communication uses Railway's internal fd12::/16 network.
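The routes above can be probed directly once the service is up. A minimal sketch, assuming a made-up Railway hostname (substitute your deployment's public URL); the paths are the ones quoted in the paragraph above:

```python
# Build a request against the agent-card discovery route. The hostname is an
# illustrative assumption; the paths come from the template description.
import urllib.request

BASE_URL = "https://my-coworker.up.railway.app"  # assumption: your service URL

agent_card_url = f"{BASE_URL}/api/.well-known/coworker/agent-card.json"
a2a_url = f"{BASE_URL}/api/a2a/coworker"

req = urllib.request.Request(agent_card_url,
                             headers={"Accept": "application/json"})

# Against a live deployment you would then fetch the card:
# import json
# with urllib.request.urlopen(req) as resp:
#     card = json.load(resp)
```

Other A2A-capable agents use the agent card at that well-known path to discover Coworker's capabilities before talking to the `/api/a2a/coworker` endpoint.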

Why Deploy Coworker on Railway?

Railway is a single platform for deploying your entire infrastructure stack. Railway hosts your infrastructure so you don't have to deal with server configuration, while letting you scale it both vertically and horizontally.

By deploying Coworker on Railway, you are one step closer to running a complete full-stack application with minimal operational burden. Host your servers, databases, AI agents, and more on Railway.
