Doku
Doku is an open-source LLMOps tool to monitor & evaluate LLM applications
- Doku Client — `dokulabs/doku-client:latest` (volume: `/app/client/data`)
- Doku Ingester — `dokulabs/doku-ingester:latest`
- ClickHouse — `clickhouse/clickhouse-server:24.2.2-alpine` (volume: `/var/lib/clickhouse`)
Documentation | Quickstart | Python SDK | Node SDK | Helm Chart
Doku is an open-source LLMOps tool engineered to provide developers with comprehensive capabilities to monitor, analyze, and optimize LLM applications. It provides valuable real-time data on LLM usage, performance, and costs. Through seamless integrations with leading LLM platforms, including OpenAI, Cohere, Mistral, and Anthropic, Doku acts as a central command center for all your LLM needs. It effectively guides your efforts, ensuring that your LLM applications not only operate at peak efficiency but also scale successfully.
This template automates the installation and pre-configuration of the following components: the Doku Client, the Doku Ingester, and a ClickHouse database.
With Doku running, the next step is to access the Doku UI and generate an API key for secure communication between your applications and Doku.
Visit https://doku-client-.up.railway.app/login and sign in with the default credentials:

- Email: `user@dokulabs.com`
- Password: `dokulabsuser`
> 💡 Tip: Alternatively, you can use the HTTP API to create your Doku API Key. For further details, take a look at the API Reference section.
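If you prefer to script key creation, a minimal sketch using Python's `requests` is shown below. The endpoint path, payload, and authentication header are illustrative assumptions, not confirmed routes; check the API Reference for the actual details before using it.

```python
# Hedged sketch: creating a Doku API key over HTTP.
# The "/api/keys" path, JSON payload, and Authorization header are assumptions
# for illustration only; see the Doku API Reference for the real endpoint.
import requests

DOKU_CLIENT_URL = "https://doku-client-.up.railway.app"  # your deployed Doku Client URL

response = requests.post(
    f"{DOKU_CLIENT_URL}/api/keys",                           # hypothetical endpoint
    json={"name": "railway-demo-key"},                       # hypothetical payload
    headers={"Authorization": "Bearer YOUR_SESSION_TOKEN"},  # hypothetical auth header
)
response.raise_for_status()
print(response.json())  # the new API key should be returned in the response body
```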
Choose the appropriate SDK for your LLM application's programming language and follow the steps to integrate monitoring with just two lines of code.
Python
Install the `dokumetry` Python SDK using pip:

```sh
pip install dokumetry
```
Add the following two lines to your application code:

```python
import dokumetry
dokumetry.init(llm=client, doku_url="https://doku-ingester-.up.railway.app/", api_key="YOUR_DOKU_TOKEN")
```
OpenAI
Usage:

```python
from openai import OpenAI
import dokumetry

client = OpenAI(
    api_key="YOUR_OPENAI_KEY"
)

# Pass the `client` object along with your Doku Ingester URL and API key;
# this ensures that all OpenAI calls are automatically tracked.
dokumetry.init(llm=client, doku_url="https://doku-ingester-.up.railway.app/", api_key="YOUR_DOKU_TOKEN")

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "What is LLM Observability",
        }
    ],
    model="gpt-3.5-turbo",
)
```
Refer to the `dokumetry` Python SDK repository for more advanced configurations and use cases.
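The same two-line pattern applies to the other supported platforms listed above. As a hedged sketch, monitoring an Anthropic application might look like the following; the client setup and model name here are illustrative, so verify the provider-specific details in the `dokumetry` repository.

```python
# Hedged sketch: Anthropic is listed among Doku's supported platforms, and this
# mirrors the OpenAI example above; confirm specifics in the dokumetry SDK repo.
import anthropic
import dokumetry

client = anthropic.Anthropic(api_key="YOUR_ANTHROPIC_KEY")

# Same two-line integration: pass the provider client plus your Ingester URL and API key.
dokumetry.init(llm=client, doku_url="https://doku-ingester-.up.railway.app/", api_key="YOUR_DOKU_TOKEN")

message = client.messages.create(
    model="claude-3-haiku-20240307",  # illustrative model name
    max_tokens=256,
    messages=[{"role": "user", "content": "What is LLM Observability"}],
)
```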
NodeJS

Install the `dokumetry` NodeJS SDK using npm:

```sh
npm install dokumetry
```
Add the following two lines to your application code:

```js
import DokuMetry from 'dokumetry';
DokuMetry.init({llm: openai, dokuUrl: "https://doku-ingester-.up.railway.app/", apiKey: "YOUR_DOKU_TOKEN"})
```
OpenAI
Usage:

```js
import OpenAI from 'openai';
import DokuMetry from 'dokumetry';

const openai = new OpenAI({
  apiKey: "YOUR_OPENAI_KEY",
});

// Pass the `openai` object along with your Doku Ingester URL and API key;
// this ensures that all OpenAI calls are automatically tracked.
DokuMetry.init({llm: openai, dokuUrl: "https://doku-ingester-.up.railway.app/", apiKey: "YOUR_DOKU_TOKEN"})

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    messages: [{ role: 'user', content: 'What are the keys to effective observability?' }],
    model: 'gpt-3.5-turbo',
  });
}

main();
```
Refer to the `dokumetry` NodeJS SDK repository for more advanced configurations and use cases.
Once you have the Doku Ingester and DokuMetry SDKs set up, you can instantly get insights into how your LLM applications are performing in the Doku Client UI. Just head over to https://doku-client-.up.railway.app in your browser to start exploring.
With Doku, you get a simple, powerful view into important info like how much you’re spending on LLMs, which parts of your app are using them the most, and how well they’re performing. Find out which LLM models are favorites among your applications, and dive deep into performance details to make smart decisions. This setup is perfect for optimizing your app performance and keeping an eye on costs.
Template Content
- Doku Client — `ghcr.io/dokulabs/doku-client:latest`
- Doku Ingester — `ghcr.io/dokulabs/doku-ingester:latest`