Ragpi

🤖 An AI assistant answering questions from your documentation

Deploy Ragpi

This template deploys the core Ragpi services:

  • postgres: pgvector/pgvector:pg17, with a volume mounted at /var/lib/postgresql/data
  • ragpi-api: ragpi/ragpi
  • ragpi-worker: ragpi/ragpi
  • redis: bitnami/redis:7.2.5, with a volume mounted at /bitnami

Ragpi is an open-source AI assistant API that answers questions using your documentation, GitHub issues, and READMEs. It combines LLMs with intelligent search to provide relevant, documentation-backed answers through a simple API.

Documentation | API Reference

Key Features

  • 📚 Builds knowledge bases from docs, GitHub issues and READMEs
  • 🤖 Agentic RAG system for dynamic document retrieval
  • 🔌 Supports OpenAI, Ollama, Deepseek & OpenAI-Compatible models
  • 💬 Discord integration for community support
  • 🚀 API-first design with Docker deployment

Configuring Deployment

You will need to configure the LLM providers you would like to use, i.e., the CHAT_PROVIDER and EMBEDDING_PROVIDER environment variables, as well as any variables required by the provider you choose, e.g., OPENAI_API_KEY if you choose openai as your provider. The provider documentation explains how to configure each provider, and the required environment variables are listed on each provider's respective page.
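
For example, a minimal setup that uses openai for both providers could look like the following (illustrative placeholder values only; the provider documentation lists every supported provider and its required variables):

```
CHAT_PROVIDER=openai
EMBEDDING_PROVIDER=openai
OPENAI_API_KEY=<your-openai-api-key>
```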

If there are other environment variables you would like to configure that are not in the deployment template, you can add them to your services in the Railway project canvas after deploying the template.

After deploying the core services, Railway will give you a public URL for the ragpi-api service, which you can use to access the API. You can also enable API authentication by setting the RAGPI_API_KEY environment variable on the ragpi-api service and using that key to authenticate requests to the API.
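
Once the public URL is available, you can call the API from any HTTP client. Below is a minimal sketch in Python; the /chat endpoint, request body, and bearer-token header are assumptions made for illustration, so check the API Reference for the actual paths and schema:

```python
# Minimal sketch of calling a deployed Ragpi API from Python.
# The endpoint path, request body, and auth header format below are
# assumptions for illustration; see the API Reference for the real schema.
import os

import requests

BASE_URL = "https://<your-ragpi-api-domain>"   # public URL Railway assigns to ragpi-api
API_KEY = os.environ.get("RAGPI_API_KEY")      # only needed if API authentication is enabled

headers = {"Authorization": f"Bearer {API_KEY}"} if API_KEY else {}

resp = requests.post(
    f"{BASE_URL}/chat",  # hypothetical endpoint used here for illustration
    json={"messages": [{"role": "user", "content": "How do I add a documentation source?"}]},
    headers=headers,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```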

Deploying Integrations

Each Ragpi integration has its own Railway deployment template. Once you have deployed the core Ragpi services, you can deploy integrations like Slack and Discord by adding a new service to your project canvas. You can do this on your project's Architecture page by clicking the Create button, selecting the Template option, and searching for the integration you want to deploy, e.g., Ragpi Discord Integration or Ragpi Slack Integration.

After selecting the integration template, you will need to configure the required environment variables for the integration. The RAGPI_BASE_URL environment variable should already be set to the URL of the ragpi-api service you deployed earlier. If you enabled API authentication, the RAGPI_API_KEY environment variable should also be set to the API key you configured for the ragpi-api service. You can find the required environment variables for each integration in the integration's documentation.
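
For reference, an integration service will typically carry at least these two variables (shown with placeholder values; each integration's documentation lists the rest of its required settings):

```
RAGPI_BASE_URL=https://<your-ragpi-api-domain>
RAGPI_API_KEY=<your-api-key>   # only if API authentication is enabled on ragpi-api
```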


Template Content

  • ragpi-api: ragpi/ragpi
  • ragpi-worker: ragpi/ragpi

Details

  • Created by Ragpi's Projects on Feb 22, 2025
  • 12 total projects, 3 active projects
  • 100% success on recent deploys
  • Category: AI/ML


