ollama-webui
Web UI for Ollama, frontend for LLMs
Features
Intuitive Interface: Our chat interface takes inspiration from ChatGPT, ensuring a user-friendly experience.
Responsive Design: Enjoy a seamless experience on both desktop and mobile devices.
Swift Responsiveness: Enjoy fast and responsive performance.
Effortless Setup: Install seamlessly using Docker for a hassle-free experience (see the deployment sketch after this list).
Code Syntax Highlighting: Enjoy enhanced code readability with our syntax highlighting feature.
Full Markdown and LaTeX Support: Elevate your LLM experience with comprehensive Markdown and LaTeX capabilities for enriched interaction.
Download/Delete Models: Easily download or remove models directly from the web UI.
Multiple Model Support: Seamlessly switch between different chat models for diverse interactions.
Multi-Model Conversations: Engage with several models simultaneously, harnessing their unique strengths in parallel for better responses.
OpenAI Model Integration: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.
Regeneration History Access: Easily revisit and explore your entire regeneration history.
Chat History: Effortlessly access and manage your conversation history.
Import/Export Chat History: Seamlessly move your chat data in and out of the platform.
Voice Input Support: Talk to your model directly through voice interactions, with the option to send voice input automatically after 3 seconds of silence.
Fine-Tuned Control with Advanced Parameters: Gain a deeper level of control by adjusting parameters such as temperature and defining your system prompts to tailor the conversation to your specific preferences and needs.
Auth Header Support: Effortlessly enhance security by adding Authorization headers to Ollama requests directly from the web UI settings, ensuring access to secured Ollama servers.
External Ollama Server Connection: Seamlessly link to an external Ollama server hosted at a different address by configuring the environment variable during the Docker build phase (see the deployment sketch after this list). You can also set the external server connection URL from the web UI post-build.
Backend Reverse Proxy Support: Strengthen security by enabling direct communication between the Ollama Web UI backend and Ollama, eliminating the need to expose Ollama over the LAN.
Continuous Updates: We are committed to improving Ollama Web UI with regular updates and new features.
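As a point of reference, here is a minimal deployment sketch covering the Docker setup and external-server connection described above. The image name, port mapping, and the OLLAMA_API_BASE_URL variable are assumptions based on typical ollama-webui deployments, not values confirmed by this template; check the project's README for the exact ones.

```shell
# Minimal sketch: run the web UI in Docker and point it at an Ollama server
# running elsewhere. Image name, ports, and OLLAMA_API_BASE_URL are assumptions
# to verify against the project's README.
docker run -d \
  --name ollama-webui \
  -p 3000:8080 \
  -e OLLAMA_API_BASE_URL="http://your-ollama-host:11434/api" \
  --restart always \
  ghcr.io/ollama-webui/ollama-webui:main
```

With a command like this, the UI would be reachable at http://localhost:3000, while OLLAMA_API_BASE_URL tells the backend which Ollama instance to talk to, so Ollama itself never has to be exposed beyond that address.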
Template Content
ollama-webui
ollama-webui/ollama-webui
Details
James Clayton's Projects
Created on Nov 26, 2023
202 total projects
68 active projects
20% success on recent deploys
Svelte, Python, TypeScript, JavaScript, CSS, HTML, Dockerfile, Shell
AI/ML
More templates in this category
openui
Deploy OpenUI: AI-powered UI generation with GitHub OAuth and OpenAI API.
zexd's Projects