Railway

Deploy FastAPIChat

A FastAPI app for benchmarking LLMs from various vendors.


Overview

FastAPIChat is a modern, high-performance application for benchmarking LLMs from different vendors. Built with FastAPI and Python 3.9+, it leverages asynchronous request handling and provides an extensive system for collecting user feedback.

Highlights

  • Multi-LLM Benchmarking: Easily compare and evaluate the performance of LLMs from various vendors across different system prompts and temperature settings.
  • Feedback Collection: A robust feedback system for rating LLM outputs, enabling data collection for future fine-tuning and improvements.
  • Streaming Responses: Real-time, non-blocking streaming responses built on FastAPI, so users see output as it is generated rather than waiting for the full reply.

