Deploy OpenTulpa AI Employee
Self-hosted Telegram AI employee with durable memory.
Deploy and Host OpenTulpa AI Employee
Deploy OpenTulpa as a self-hosted AI employee you operate from Telegram. The service runs from the public kvyb/opentulpa repository, persists state on a Railway volume, and uses an OpenAI-compatible model provider.
About Hosting OpenTulpa AI Employee
OpenTulpa is a FastAPI- and LangGraph-based agent runtime for work that repeats. It keeps durable memory, workflow state, checkpoints, behavior logs, and generated artifacts on disk. This Railway template mounts persistent storage at /app/opentulpa_data and starts the server with ./start.sh run server.
The deployed service exposes health checks at /healthz and /agent/healthz. When Telegram variables are configured, OpenTulpa registers the Telegram webhook from the public Railway domain during startup.
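As a minimal sketch of that startup step: the webhook URL is derived from the public Railway domain. The /telegram/webhook path below is illustrative only, not taken from the repository.

```shell
# Illustrative only: OpenTulpa registers its Telegram webhook against the
# public Railway domain at startup. The /telegram/webhook path is an
# assumption for illustration, not confirmed from the repository.
PUBLIC_BASE_URL="https://opentulpa-production.up.railway.app"  # example domain
echo "${PUBLIC_BASE_URL}/telegram/webhook"
```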
Why Deploy OpenTulpa AI Employee
Deploy OpenTulpa when you want an inspectable AI operator running on infrastructure you control. It is useful for owner-operated Telegram workflows, customer intake over Telegram Business, recurring routines, app-connected tasks, and browser workflows that need durable state across sessions.
Common Use Cases
- Personal operator in Telegram for research, reports, scripting, reminders, and follow-ups.
- Telegram Business intake for lead qualification, appointment collection, and handoff/escalation.
- Workflow employee that remembers source material, preferences, customer state, and prior decisions.
- Composio-connected tasks for Google Sheets, Gmail, Slack, Notion, HubSpot, Instagram, and other SaaS tools.
- Browser Use Cloud workflows with live login handoff and persistent profiles.
Dependencies for OpenTulpa AI Employee Hosting
You need these before deploying:
- A Railway account.
- A Telegram bot token from @BotFather.
- Your numeric Telegram user ID from a helper bot such as @userinfobot or @RawDataBot.
- An OpenAI-compatible model provider API key, for example OpenRouter.
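Because the allow-list takes numeric IDs rather than usernames, a quick pre-deploy sanity check can catch mistakes. This is a hedged sketch with example IDs, not part of OpenTulpa itself:

```shell
# Hypothetical pre-deploy check (not part of OpenTulpa): confirm the
# allow-list contains only comma-separated numeric Telegram user IDs.
TELEGRAM_ALLOWED_USER_IDS="123456789,987654321"   # example values
ok=true
IFS=','
for id in $TELEGRAM_ALLOWED_USER_IDS; do
  case "$id" in
    ''|*[!0-9]*) ok=false ;;   # empty entry or non-digit character
  esac
done
echo "$ok"   # prints true when every entry is numeric
```

A username such as @someone would flip the check to false, which is exactly the misconfiguration the variable description warns against.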
Deployment Dependencies
Required variables:
| Variable | What to enter |
|---|---|
| OPENAI_COMPATIBLE_API_KEY | API key for your OpenAI-compatible model provider. |
| OPENAI_COMPATIBLE_BASE_URL | Provider base URL, for example https://openrouter.ai/api/v1. |
| TELEGRAM_BOT_TOKEN | Telegram bot token from BotFather. |
| TELEGRAM_ALLOWED_USER_IDS | Comma-separated numeric Telegram user IDs allowed to operate the employee. Use IDs, not usernames. |
| TELEGRAM_WEBHOOK_SECRET | Webhook validation secret. Use a generated secret value. |
| PUBLIC_BASE_URL | Public service URL. Use the generated Railway HTTPS domain. |
| OPENTULPA_DATA_ROOT | Persistent storage root. Use /app/opentulpa_data. |
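TELEGRAM_WEBHOOK_SECRET just needs to be a high-entropy string you generate yourself. One common way, assuming openssl is available on your machine, is:

```shell
# Generate a random 64-character hex secret for TELEGRAM_WEBHOOK_SECRET,
# then paste it into the Railway variables UI. Any high-entropy string works.
TELEGRAM_WEBHOOK_SECRET=$(openssl rand -hex 32)
echo "$TELEGRAM_WEBHOOK_SECRET"
```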
Optional variables after first deploy:
| Variable | Enables |
|---|---|
| COMPOSIO_API_KEY | Google Sheets, Gmail, Slack, Notion, HubSpot, Instagram, and other Composio Tool Router integrations. |
| BROWSER_USE_API_KEY | Browser Use Cloud sessions with live view and persistent profiles. |
| CAPSOLVER_API_KEY | CAPTCHA solving for supported browser workflows. |
| LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_BASE_URL | Langfuse tracing for turns, tools, model calls, and side-effect summaries. |
After deploy, verify /healthz and /agent/healthz, then message your Telegram bot from an allowed numeric Telegram user ID.
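The two health endpoints can be checked from any terminal. The domain below is a hypothetical placeholder; substitute your service's generated Railway domain. This sketch prints the exact curl commands to run:

```shell
# Print health-check commands for the deployed service. The domain is a
# hypothetical placeholder; use your own generated Railway domain.
BASE="https://opentulpa-production.up.railway.app"
for path in /healthz /agent/healthz; do
  echo "curl -fsS ${BASE}${path}"
done
```

Both commands should return a success response once the service is up; only then message the bot from an allowed user ID.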