Deploy Fabric

Fabric framework API with bundled AI prompt patterns and persistent config


Deploy and Host Fabric on Railway

Fabric is an open-source framework for augmenting work with AI through reusable prompts called patterns. This template deploys Fabric as a hosted REST API, so apps, automations, and agents can call Fabric patterns, chat completions, model discovery, sessions, contexts, and helper tools over HTTP.

About Hosting Fabric

This Railway template builds Fabric from source using a custom Dockerfile, starts the Fabric REST API, and exposes it behind a Railway public domain. It includes bundled Fabric patterns and strategies, Swagger API docs, /health health checks, and API-key protection through the X-API-Key header. FABRIC_API_KEY is preconfigured by the template as ${{ secret(32) }}, so Railway generates a strong API secret automatically. For persistent usage, attach or keep the Railway volume at /home/appuser/.config/fabric; this keeps config, sessions, contexts, patterns, and strategies across redeploys. Before first use, set DEFAULT_VENDOR, DEFAULT_MODEL, and one provider API key such as OPENROUTER_API_KEY, OPENAI_API_KEY, ANTHROPIC_API_KEY, or GEMINI_API_KEY.

Common Use Cases

  • Run Fabric patterns from apps, agents, scripts, or workflow automations.
  • Host a reusable AI prompt API for summarization, extraction, analysis, and text transformation.
  • Deploy one Fabric endpoint that can use OpenRouter, OpenAI, Anthropic, Gemini, and many other providers.
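The first use case can be sketched as a minimal Python client. The base URL and API key below are placeholders for your Railway domain and generated FABRIC_API_KEY, and the sketch sticks to the endpoints documented on this page (/patterns/names, /models/names, /health):

```python
import json
import urllib.request

# Placeholders: substitute your Railway public domain and the
# FABRIC_API_KEY value generated by the template.
BASE_URL = "https://your-app.up.railway.app"
API_KEY = "your-fabric-api-key"

def fabric_get(path: str) -> urllib.request.Request:
    # Every Fabric API call authenticates via the X-API-Key header.
    return urllib.request.Request(
        f"{BASE_URL}{path}",
        headers={"X-API-Key": API_KEY},
    )

def list_patterns() -> list[str]:
    # GET /patterns/names returns the bundled pattern names as JSON.
    with urllib.request.urlopen(fabric_get("/patterns/names")) as resp:
        return json.loads(resp.read())

# Usage (requires a running deployment):
#   for name in list_patterns():
#       print(name)
```

The same helper works for /models/names or any other authenticated route; consult the Swagger docs on your deployment for the full endpoint list.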

Dependencies for Fabric Hosting

  • Railway account and project.
  • FABRIC_API_KEY, generated automatically by the template with ${{ secret(32) }}.
  • One AI provider API key matching DEFAULT_VENDOR.
  • Required user-selected variables: DEFAULT_VENDOR, DEFAULT_MODEL, and the matching provider key.
  • Railway volume mounted at /home/appuser/.config/fabric.
  • Public Railway domain for external API access.

Provider Variable Options

DEFAULT_VENDOR should use one of these provider names. Set only the matching credential variables for the provider you choose.

OpenAI-compatible providers:

DEFAULT_VENDOR | Required variables | Optional variables
Abacus | ABACUS_API_KEY | ABACUS_API_BASE_URL
AIML | AIML_API_KEY | AIML_API_BASE_URL
Cerebras | CEREBRAS_API_KEY | CEREBRAS_API_BASE_URL
DeepSeek | DEEPSEEK_API_KEY | DEEPSEEK_API_BASE_URL
GitHub | GITHUB_API_KEY | GITHUB_API_BASE_URL
GrokAI | GROKAI_API_KEY | GROKAI_API_BASE_URL
Groq | GROQ_API_KEY | GROQ_API_BASE_URL
Infermatic | INFERMATIC_API_KEY | INFERMATIC_API_BASE_URL
Langdock | LANGDOCK_API_KEY | LANGDOCK_API_BASE_URL, LANGDOCK_REGION
LiteLLM | LITELLM_API_KEY | LITELLM_API_BASE_URL
Mammouth | MAMMOUTH_API_KEY | MAMMOUTH_API_BASE_URL
MiniMax | MINIMAX_API_KEY | MINIMAX_API_BASE_URL
Mistral | MISTRAL_API_KEY | MISTRAL_API_BASE_URL
Novita AI | NOVITA_AI_API_KEY | NOVITA_AI_API_BASE_URL
OpenRouter | OPENROUTER_API_KEY | OPENROUTER_API_BASE_URL
SiliconCloud | SILICONCLOUD_API_KEY | SILICONCLOUD_API_BASE_URL
Together | TOGETHER_API_KEY | TOGETHER_API_BASE_URL
Venice AI | VENICE_AI_API_KEY | VENICE_AI_API_BASE_URL
Z AI | Z_AI_API_KEY | Z_AI_API_BASE_URL

Native and special providers:

DEFAULT_VENDOR | Required variables | Optional variables
Anthropic | ANTHROPIC_API_KEY | ANTHROPIC_API_BASE_URL
Azure | AZURE_API_KEY, AZURE_API_BASE_URL, AZURE_DEPLOYMENTS | AZURE_API_VERSION
AzureAIGateway | AZUREAIGATEWAY_BACKEND, AZUREAIGATEWAY_GATEWAY_URL, AZUREAIGATEWAY_SUBSCRIPTION_KEY | AZUREAIGATEWAY_API_VERSION
AzureEntra | AZUREENTRA_API_BASE_URL, AZUREENTRA_DEPLOYMENTS | AZUREENTRA_API_VERSION; also provide Azure identity env such as AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, AZURE_TENANT_ID
Bedrock | BEDROCK_AWS_REGION | BEDROCK_API_KEY, BEDROCK_AWS_ACCESS_KEY_ID, BEDROCK_AWS_SECRET_ACCESS_KEY; AWS credential-chain env also works
Codex | CODEX_REFRESH_TOKEN, CODEX_ACCOUNT_ID | CODEX_ACCESS_TOKEN, CODEX_BASE_URL, CODEX_AUTH_BASE_URL
Copilot | COPILOT_TENANT_ID, COPILOT_CLIENT_ID, plus either COPILOT_ACCESS_TOKEN or COPILOT_REFRESH_TOKEN with COPILOT_CLIENT_SECRET | COPILOT_API_BASE_URL, COPILOT_TIME_ZONE
DigitalOcean | DIGITALOCEAN_INFERENCE_KEY | DIGITALOCEAN_INFERENCE_BASE_URL, DIGITALOCEAN_TOKEN
Exolab | EXOLAB_API_BASE_URL, EXOLAB_MODELS | EXOLAB_API_KEY
Gemini | GEMINI_API_KEY | None
LM Studio | LM_STUDIO_API_URL | LM_STUDIO_API_KEY
Ollama | OLLAMA_API_URL | OLLAMA_API_KEY, OLLAMA_HTTP_TIMEOUT
OpenAI | OPENAI_API_KEY | OPENAI_API_BASE_URL
Perplexity | PERPLEXITY_API_KEY | None
VertexAI | VERTEXAI_PROJECT_ID | VERTEXAI_REGION; also provide Google Application Default Credentials such as GOOGLE_APPLICATION_CREDENTIALS
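As a worked example from the table above, an Anthropic deployment needs only the vendor name, a model, and the Anthropic key (the model name here is illustrative; pick any model your key can access):

```
DEFAULT_VENDOR=Anthropic
DEFAULT_MODEL=claude-3-5-sonnet-latest
ANTHROPIC_API_KEY=
```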

Other Runtime Options

Variable | Default | Description
DEFAULT_MODEL_CONTEXT_LENGTH | empty | Optional context length override for the default model.
PORT | Railway-provided | HTTP port. Railway injects this automatically.
FABRIC_CONFIG_DIR | /home/appuser/.config/fabric | Runtime config directory. Keep this aligned with the Railway volume mount.
FABRIC_APP_DATA_DIR | /usr/local/share/fabric | Read-only bundled seed data directory inside the image.
PATTERNS_LOADER_GIT_REPO_URL | https://github.com/danielmiessler/fabric.git | Source repo for pattern loading.
PATTERNS_LOADER_GIT_REPO_PATTERNS_FOLDER | data/patterns | Pattern folder inside the source repo.
PROMPT_STRATEGIES_GIT_REPO_URL | https://github.com/danielmiessler/fabric.git | Source repo for prompt strategies.
PROMPT_STRATEGIES_GIT_REPO_STRATEGIES_FOLDER | data/strategies | Strategy folder inside the source repo.
CUSTOM_PATTERNS_DIRECTORY | empty | Extra directory for custom Fabric patterns.
LANGUAGE_OUTPUT | empty | Optional default language output hint.
JINA_AI_API_KEY | empty | Optional Jina API key for web extraction helpers.
YOUTUBE_API_KEY | empty | Optional YouTube API key for metadata and comments.
SPOTIFY_CLIENT_ID | empty | Optional Spotify integration client ID.
SPOTIFY_CLIENT_SECRET | empty | Optional Spotify integration client secret.
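For example, to load patterns from a fork instead of the upstream repo, override the loader variables (the repo URL below is illustrative):

```
PATTERNS_LOADER_GIT_REPO_URL=https://github.com/your-org/fabric-patterns.git
PATTERNS_LOADER_GIT_REPO_PATTERNS_FOLDER=data/patterns
```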

Implementation Details

Template-provided secret:

FABRIC_API_KEY=${{ secret(32) }}

Recommended first-deploy variables:

DEFAULT_VENDOR=OpenRouter
DEFAULT_MODEL=openai/gpt-4.1-mini
OPENROUTER_API_KEY=

OpenAI example:

DEFAULT_VENDOR=OpenAI
DEFAULT_MODEL=gpt-4.1-mini
OPENAI_API_KEY=

After deploy:

curl https://<your-railway-domain>/health
curl -H "X-API-Key: $FABRIC_API_KEY" https://<your-railway-domain>/models/names
curl -H "X-API-Key: $FABRIC_API_KEY" https://<your-railway-domain>/patterns/names

Swagger is available at:

https://<your-railway-domain>/swagger/index.html

Why Deploy Fabric on Railway?

Railway is a singular platform to deploy your infrastructure stack. Railway will host your infrastructure so you don't have to deal with configuration, while allowing you to vertically and horizontally scale it.

By deploying Fabric on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.
