Deploy OpenTulpa AI Employee

Self-hosted Telegram AI employee with durable memory.

Deploy and Host OpenTulpa AI Employee

Deploy OpenTulpa as a self-hosted AI employee you operate from Telegram. The service runs from the public kvyb/opentulpa repository, persists state on a Railway volume, and uses an OpenAI-compatible model provider.

About Hosting OpenTulpa AI Employee

OpenTulpa is a FastAPI- and LangGraph-based agent runtime for recurring work. It keeps durable memory, workflow state, checkpoints, behavior logs, and generated artifacts on disk. This Railway template mounts persistent storage at /app/opentulpa_data and starts the server with ./start.sh run server.

The deployed service exposes health checks at /healthz and /agent/healthz. When Telegram variables are configured, OpenTulpa registers the Telegram webhook from the public Railway domain during startup.
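Since the webhook is registered from the public Railway domain at startup, the registered URL is presumably the public base URL joined with a webhook path. A minimal sketch of that derivation is below; the path /telegram/webhook and the function name are assumptions for illustration, not taken from the kvyb/opentulpa source.

```python
# Hypothetical sketch: derive the Telegram webhook URL from PUBLIC_BASE_URL.
# The "/telegram/webhook" path is an assumption; check the kvyb/opentulpa
# repository for the route the server actually registers.

def telegram_webhook_url(public_base_url: str, path: str = "/telegram/webhook") -> str:
    """Join the public base URL and the webhook path without a double slash."""
    return public_base_url.rstrip("/") + path

# With PUBLIC_BASE_URL set to the generated Railway domain, the webhook
# registered during startup would look like:
print(telegram_webhook_url("https://my-service.up.railway.app/"))
# https://my-service.up.railway.app/telegram/webhook
```

This is also why PUBLIC_BASE_URL must be the generated Railway HTTPS domain: Telegram only delivers webhook updates to a publicly reachable HTTPS endpoint.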

Why Deploy OpenTulpa AI Employee

Deploy OpenTulpa when you want an inspectable AI operator running on infrastructure you control. It is useful for owner-operated Telegram workflows, customer intake over Telegram Business, recurring routines, app-connected tasks, and browser workflows that need durable state across sessions.

Common Use Cases

  • Personal operator in Telegram for research, reports, scripting, reminders, and follow-ups.
  • Telegram Business intake for lead qualification, appointment collection, and handoff/escalation.
  • Workflow employee that remembers source material, preferences, customer state, and prior decisions.
  • Composio-connected tasks for Google Sheets, Gmail, Slack, Notion, HubSpot, Instagram, and other SaaS tools.
  • Browser Use Cloud workflows with live login handoff and persistent profiles.

Dependencies for OpenTulpa AI Employee Hosting

You need these before deploying:

  • A Railway account.
  • A Telegram bot token from @BotFather.
  • Your numeric Telegram user ID from a helper bot such as @userinfobot or @RawDataBot.
  • An OpenAI-compatible model provider API key, for example OpenRouter.

Deployment Dependencies

Required variables:

  • OPENAI_COMPATIBLE_API_KEY: API key for your OpenAI-compatible model provider.
  • OPENAI_COMPATIBLE_BASE_URL: Provider base URL, for example https://openrouter.ai/api/v1.
  • TELEGRAM_BOT_TOKEN: Telegram bot token from BotFather.
  • TELEGRAM_ALLOWED_USER_IDS: Comma-separated numeric Telegram user IDs allowed to operate the employee. Use IDs, not usernames.
  • TELEGRAM_WEBHOOK_SECRET: Webhook validation secret. Use a generated secret value.
  • PUBLIC_BASE_URL: Public service URL. Use the generated Railway HTTPS domain.
  • OPENTULPA_DATA_ROOT: Persistent storage root. Use /app/opentulpa_data.
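Put together, the required variables might look like the following in Railway's variable editor. All values here are placeholders; substitute your own keys, your numeric user IDs, and the HTTPS domain Railway generates for the service.

```
OPENAI_COMPATIBLE_API_KEY=<your-provider-api-key>
OPENAI_COMPATIBLE_BASE_URL=https://openrouter.ai/api/v1
TELEGRAM_BOT_TOKEN=<token-from-botfather>
TELEGRAM_ALLOWED_USER_IDS=111111111,222222222
TELEGRAM_WEBHOOK_SECRET=<generated-secret>
PUBLIC_BASE_URL=https://my-service.up.railway.app
OPENTULPA_DATA_ROOT=/app/opentulpa_data
```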

Optional variables after first deploy:

  • COMPOSIO_API_KEY: enables Google Sheets, Gmail, Slack, Notion, HubSpot, Instagram, and other Composio Tool Router integrations.
  • BROWSER_USE_API_KEY: enables Browser Use Cloud sessions with live view and persistent profiles.
  • CAPSOLVER_API_KEY: enables CAPTCHA solving for supported browser workflows.
  • LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_BASE_URL: enable Langfuse tracing for turns, tools, model calls, and side-effect summaries.

After deploy, verify /healthz and /agent/healthz, then message your Telegram bot from an allowed numeric Telegram user ID.
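One quick way to verify both health endpoints is with curl; the domain below is a placeholder for the one Railway generates for your service.

```shell
# Replace with the HTTPS domain Railway generated for the service.
BASE_URL="https://my-service.up.railway.app"

# Both endpoints should return a 2xx response once the server is up;
# -f makes curl exit non-zero on an HTTP error status.
curl -fsS "$BASE_URL/healthz"
curl -fsS "$BASE_URL/agent/healthz"
```

If either check fails, inspect the service logs in Railway before messaging the bot.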
