Deploy Breadcrumb
Simple, self-hosted LLM observability for TypeScript
Deploy and Host Breadcrumb on Railway
Breadcrumb is an open-source LLM observability platform for TypeScript. It traces your AI agents and LLM pipelines, capturing prompts, completions, token counts, latency, and costs per call. Self-hosted, so your data stays on your infrastructure. Instrument your code in three lines with the TypeScript SDK.
About Hosting Breadcrumb
Breadcrumb runs three services: a Hono HTTP server that handles trace ingestion, authentication, and a tRPC API; a PostgreSQL database for projects and API keys; and a ClickHouse instance for high-volume trace and span analytics. The React dashboard is bundled with the server.
To deploy, you need to provision PostgreSQL and ClickHouse instances, set environment variables for database connections and a JWT secret, and run database migrations. Railway handles all three services in a single project, so you can connect them through internal networking without exposing databases to the public internet.
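As a sketch, the environment variables for the server service look something like the following. `ENCRYPTION_KEY` and `BETTER_AUTH_SECRET` come from the template; `DATABASE_URL` and `CLICKHOUSE_URL` are assumed names, so check the repository's configuration reference for the exact keys your release expects:

```bash
# Postgres connection (assumed variable name) -- use Railway's internal hostname
DATABASE_URL=postgresql://postgres:<password>@postgres.railway.internal:5432/railway

# ClickHouse connection (assumed variable name)
CLICKHOUSE_URL=http://clickhouse.railway.internal:8123

# Secrets from the template -- generate each with: openssl rand -hex 32
ENCRYPTION_KEY=<64-hex-chars>
BETTER_AUTH_SECRET=<64-hex-chars>
```

Using the `*.railway.internal` hostnames keeps both databases on Railway's private network, so neither needs a public endpoint.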
Common Use Cases
- Debug failing AI agents - Trace every step of a multi-step LLM pipeline to pinpoint where prompts go wrong, tokens spike, or responses degrade
- Track LLM costs across projects - Monitor token usage and cost per trace to catch expensive calls before they blow through your budget
- Audit prompt and completion history - Log every prompt sent and response received for compliance, QA, or fine-tuning dataset collection
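As a rough illustration of the cost-tracking use case: per-call cost can be derived from the token counts Breadcrumb captures plus a per-model price table. The prices and function below are placeholders for illustration, not part of the Breadcrumb SDK:

```typescript
// Placeholder per-million-token prices -- substitute your provider's real rates.
interface ModelPricing {
  inputPerMTok: number;  // USD per 1M input tokens
  outputPerMTok: number; // USD per 1M output tokens
}

// Estimate the USD cost of one LLM call from its token counts.
function estimateCostUSD(
  inputTokens: number,
  outputTokens: number,
  pricing: ModelPricing,
): number {
  return (
    (inputTokens / 1_000_000) * pricing.inputPerMTok +
    (outputTokens / 1_000_000) * pricing.outputPerMTok
  );
}

// Example: a call with 150 input / 320 output tokens at
// hypothetical $3 / $15 per million tokens.
const cost = estimateCostUSD(150, 320, { inputPerMTok: 3, outputPerMTok: 15 });
console.log(cost.toFixed(5)); // "0.00525"
```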
Dependencies for Breadcrumb Hosting
- PostgreSQL 16 - Stores projects, API keys, user accounts, and session data
- ClickHouse 24+ - Stores trace and span time-series data for fast analytical queries
- Node.js 20+ - Runtime for the Breadcrumb server
Deployment Dependencies
- Breadcrumb GitHub Repository
- Breadcrumb TypeScript SDK (@breadcrumb-sdk/core)
- ClickHouse on Railway
- PostgreSQL on Railway
Implementation Details
Once deployed, install the SDK in your application and start tracing:
```typescript
import { Breadcrumb } from "@breadcrumb-sdk/core";

const bc = Breadcrumb.init({
  apiKey: "your-api-key",
  url: "https://your-breadcrumb-instance.railway.app",
});

const trace = bc.trace("my-agent");
const span = trace.span("llm-call", { model: "claude-sonnet-4-20250514" });
// ... your LLM call
span.end({ inputTokens: 150, outputTokens: 320 });
trace.end();
```
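Conceptually, each span ends up recording a name, a model, token counts, and a latency derived from its start and end timestamps. A self-contained sketch of that bookkeeping (the field names here are illustrative, not Breadcrumb's wire format):

```typescript
// Illustrative span record -- field names are assumptions, not the SDK's schema.
interface SpanRecord {
  name: string;
  model?: string;
  startedAt: number;   // epoch ms
  endedAt: number;     // epoch ms
  latencyMs: number;   // derived: endedAt - startedAt
  inputTokens: number;
  outputTokens: number;
}

// Build the record a span's end() call might produce.
function finishSpan(
  name: string,
  startedAt: number,
  tokens: { inputTokens: number; outputTokens: number },
  model?: string,
  endedAt: number = Date.now(),
): SpanRecord {
  return { name, model, startedAt, endedAt, latencyMs: endedAt - startedAt, ...tokens };
}

const rec = finishSpan(
  "llm-call",
  1_000,
  { inputTokens: 150, outputTokens: 320 },
  "claude-sonnet-4-20250514",
  1_850,
);
console.log(rec.latencyMs); // 850
```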
For Vercel AI SDK users, Breadcrumb provides a drop-in telemetry helper that captures generateText, streamText, and generateObject calls automatically.
Why Deploy Breadcrumb on Railway?
Railway is a single platform for deploying your entire infrastructure stack. Railway hosts your infrastructure so you don't have to deal with server configuration, and lets you scale it both vertically and horizontally.
By deploying Breadcrumb on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.
Template Content
breadcrumb
joshuaKnauber/breadcrumb
ENCRYPTION_KEY
Encrypts sensitive data. Generate with openssl rand -hex 32.
BETTER_AUTH_SECRET
Signs auth session tokens. Generate with openssl rand -hex 32.
ClickHouse
clickhouse/clickhouse-server:25.8