Deploy AI-Trader

AI-Trader with MCP tools and a UI dashboard.


Deploy and Host AI-Trader on Railway

AI-Trader is an autonomous trading benchmark that lets multiple LLM agents compete in historical market simulations. Agents research market context, analyze price data, and execute simulated trades using a tool-driven (MCP) architecture. It includes a lightweight web dashboard to visualize portfolio performance, trades, and leaderboard comparisons across models.

About Hosting AI-Trader

This template deploys AI-Trader as a single Railway service using Docker. The container starts the MCP tool services (math, search, trade execution, and local price lookup), runs a configured trading simulation, and serves the static UI dashboard over Railway’s public HTTP port. A persistent Railway Volume is mounted to store datasets and run outputs (agent logs, positions, transactions) so results survive restarts and redeploys. To run real agent decisions, you must provide API keys for your LLM provider (OpenAI-compatible) and optionally a search provider.
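The start-up ordering described above can be sketched as a minimal entrypoint script. This is illustrative only: the real tool-server commands are replaced with `sleep` stand-ins because the template's actual entrypoint is not shown here; only the launch order and port wiring follow the description.

```shell
#!/usr/bin/env sh
# Illustrative start-up ordering only. Each MCP tool server is stood in
# for by a short backgrounded sleep; ports default to the template's values.

start_tool() {
  # $1 = tool name, $2 = port; stand-in for launching one MCP tool server
  echo "starting $1 on :$2"
  sleep 1 >/dev/null 2>&1 &
}

start_all() {
  start_tool math     "${MATH_HTTP_PORT:-8000}"
  start_tool search   "${SEARCH_HTTP_PORT:-8001}"
  start_tool trade    "${TRADE_HTTP_PORT:-8002}"
  start_tool getprice "${GETPRICE_HTTP_PORT:-8003}"
  start_tool crypto   "${CRYPTO_HTTP_PORT:-8005}"
  echo "running simulation with CONFIG_PATH=${CONFIG_PATH:-/app/configs/default_config.json}"
  echo "serving dashboard on :${PORT:-8080}"
}

start_all
```

Only the dashboard is bound to Railway's public `$PORT`; the tool ports stay internal to the container.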

Common Use Cases

  • Benchmark multiple LLMs on the same market data and ruleset (competition arena)
  • Reproduce historical “paper trading” experiments with fixed date ranges
  • Visualize portfolio evolution, trade actions, and model comparisons in a web dashboard

Dependencies for AI-Trader Hosting

  • An OpenAI-compatible API key (for the trading agent’s LLM calls)
  • A persistent Railway Volume mounted at /app/data (to store datasets and run outputs)


Implementation Details

Environment variables used by this template:

# AI-Trader runtime
CONFIG_PATH="/app/configs/default_config.json" # Scenario config to run (US market, daily)
RUNTIME_ENV_PATH="/app/runtime_env.json"       # Runtime environment config path
AGENT_MAX_STEP="30"                            # Max reasoning/action steps per run/day

# OpenAI-compatible LLM provider (required)
OPENAI_API_BASE="https://api.openai.com/v1"    # Must end with /v1
OPENAI_API_KEY=""                              # Your OpenAI API key

# Search provider (optional, recommended if enabled in config)
JINA_API_KEY=""                                # Your Jina API key

# MCP internal tool ports (localhost only)
MATH_HTTP_PORT="8000"                          # Math tools
SEARCH_HTTP_PORT="8001"                        # Search / market intelligence
TRADE_HTTP_PORT="8002"                         # Trade execution simulator
GETPRICE_HTTP_PORT="8003"                      # Local price lookup
CRYPTO_HTTP_PORT="8005"                        # Crypto trading tools

Notes:

  • The UI is served on Railway’s assigned $PORT (public), while MCP tool ports are internal only.
  • Mount a persistent volume at /app/data to keep datasets and agent outputs across restarts.
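The two requirements called out above (an API base ending in `/v1` and a mounted data directory) can be verified with a small pre-flight check before a run. This is an illustrative sketch, not part of the template; the variable names match the env list above, and `/tmp` stands in for `/app/data` in the example call.

```shell
#!/usr/bin/env sh
# Hedged sketch: validate required settings before starting a run.
# Variable names follow the template's env list; the checks are illustrative.

check_config() {
  base="$1" key="$2" data_dir="$3"
  case "$base" in
    */v1) ;;                                   # OPENAI_API_BASE must end with /v1
    *) echo "error: OPENAI_API_BASE must end with /v1"; return 1 ;;
  esac
  [ -n "$key" ]      || { echo "error: OPENAI_API_KEY is empty"; return 1; }
  [ -d "$data_dir" ] || { echo "error: no volume mounted at $data_dir"; return 1; }
  echo "ok"
}

# Example invocation with placeholder values (/tmp stands in for /app/data):
check_config "https://api.openai.com/v1" "sk-example-key" "/tmp"
```

Running a check like this at container start surfaces misconfiguration immediately instead of partway through a simulation.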

Why Deploy AI-Trader on Railway?

Railway is a single platform for deploying your entire infrastructure stack. Railway hosts your infrastructure so you don't have to deal with configuration, while letting you scale it both vertically and horizontally.

By deploying AI-Trader on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.

