n8n-mcp
Deploy n8n-mcp to enable AI agents to build and edit n8n workflows
Deploy and Host n8n-mcp on Railway
n8n-mcp is a Model Context Protocol (MCP) server that bridges n8n's 525+ workflow automation nodes with AI assistants like Claude, enabling them to understand, design, build, and validate n8n workflows with deep knowledge of node properties, operations, and best practices.
Source Code: github.com/czlonkowski/n8n-mcp
About Hosting n8n-mcp
Hosting n8n-mcp involves deploying a lightweight, stateless HTTP server with a pre-built SQLite database containing comprehensive n8n node documentation. The server requires no runtime dependencies and runs in a Docker container optimized for minimal resource usage (~280MB, 82% smaller than typical n8n images). It provides Bearer token authentication for security and can optionally connect to your n8n instance API for workflow management capabilities. The deployment supports connections from various AI-powered development environments including Claude Desktop, Claude Code, Windsurf, Cursor, and VS Code.
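If you want to try the same container locally before deploying, you can run the published image with Docker. This is only a rough sketch: the port (3000) is an assumption about the server's default HTTP port, and the AUTH_TOKEN is generated on the fly for testing only; adjust both for your setup.
# Minimal local smoke test of the image (sketch; port 3000 is assumed, adjust if needed)
docker run --rm -p 3000:3000 \
  -e MCP_MODE=http \
  -e USE_FIXED_HTTP=true \
  -e AUTH_TOKEN="$(openssl rand -base64 32)" \
  ghcr.io/czlonkowski/n8n-mcp:latest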
Common Use Cases
- AI-Assisted Workflow Development: Enable AI assistants across multiple IDEs to accurately design and build n8n workflows with validated node configurations
- Workflow Validation as a Service: Provide real-time validation of node configurations and complete workflows before deployment, preventing runtime errors
- Centralized n8n Knowledge Base: Deploy once and connect from anywhere - share n8n expertise across your team through various AI-powered development tools
Dependencies for n8n-mcp Hosting
- No runtime dependencies required - The server is self-contained with all necessary components
- Pre-built SQLite database - Included in the Docker image with complete n8n node information
Deployment Dependencies
- Claude Desktop, Claude Code, or other MCP-compatible AI assistants
- mcp-remote - NPM package for connecting to HTTP MCP servers
- n8n Instance (optional) - For workflow management features, requires API access
- Docker - Container runtime (handled by Railway)
Implementation Details
The Railway template uses a pre-built Docker image (ghcr.io/czlonkowski/n8n-mcp:latest) that includes:
- Complete n8n node database (528 nodes, 90% documentation coverage)
- Optimized HTTP server with MCP protocol support
- Built-in security with Bearer token authentication
- Health checks and monitoring endpoints
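Once deployed, a quick way to confirm the service is running is to call the health endpoint mentioned above. The /health path is an assumption here; check the project documentation if your deployment exposes it elsewhere.
# Liveness check (sketch; assumes the health endpoint is served at /health)
curl -s https://your-app.up.railway.app/health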
Key environment variables for Railway deployment:
# Security - MUST be changed from default
AUTH_TOKEN=your-secure-token-here # Generate with: openssl rand -base64 32
# Pre-configured for Railway (do not change)
MCP_MODE=http
USE_FIXED_HTTP=true
NODE_ENV=production
LOG_LEVEL=info
TRUST_PROXY=1
CORS_ORIGIN=*
HOST=0.0.0.0
# Optional - Enable n8n workflow management
N8N_API_URL=https://your-n8n-instance.com
N8N_API_KEY=your-n8n-api-key
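If you enable the optional n8n integration, it can save debugging time to verify the API credentials directly before adding them to Railway. A rough check against n8n's public REST API, assuming the standard /api/v1 base path and X-N8N-API-KEY header:
# Sanity-check the optional n8n API credentials (sketch; assumes n8n's public API conventions)
curl -s -H "X-N8N-API-KEY: your-n8n-api-key" \
  "https://your-n8n-instance.com/api/v1/workflows?limit=1"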
Connect from any MCP-compatible client:
{
  "mcpServers": {
    "n8n-railway": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://your-app.up.railway.app/mcp",
        "--header",
        "Authorization: Bearer YOUR_TOKEN"
      ]
    }
  }
}
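Before wiring this into an assistant, you can run the same mcp-remote command by hand to confirm the deployment and token work; the URL and token below are the placeholders from the config above.
# Manual connectivity check using the same command the client config runs
npx -y mcp-remote "https://your-app.up.railway.app/mcp" \
  --header "Authorization: Bearer YOUR_TOKEN"
For Claude Desktop, the JSON above typically lives in its claude_desktop_config.json file; other MCP-compatible clients have equivalent configuration locations.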
Documentation & Resources
- Full Documentation: github.com/czlonkowski/n8n-mcp
- Railway Deployment Guide: Railway-specific setup instructions
License & Attribution
n8n-mcp is released under the MIT License by Romuald Czlonkowski @ www.aiadvisors.pl/en.
While not required, attribution is appreciated:
Built with [n8n-MCP](https://github.com/czlonkowski/n8n-mcp) by Romuald Czlonkowski
Why Deploy n8n-mcp on Railway?
Railway is a singular platform to deploy your infrastructure stack. Railway will host your infrastructure so you don't have to deal with configuration, while allowing you to vertically and horizontally scale it.
By deploying n8n-mcp on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.
Template Content
- czlonkowski/n8n-mcp-railway (Docker image: ghcr.io/czlonkowski/n8n-mcp-railway:latest)