- Get real-time alerts about your model’s quality
- Execution tracing for every request
- Gradually roll out changes to models and prompts
- Debug and re-run issues from production in your IDE
Get Started with the OpenLLMetry SDK or Traceloop Hub
Traceloop uses OpenTelemetry to monitor and trace your LLM application. You can install the OpenLLMetry SDK in your application, or use Traceloop Hub as a smart proxy for all your LLM calls. To get started, pick the language you are using and follow the instructions.

- Hub (Beta)
- Python SDK (Available)
- Typescript SDK (Available)
- Go SDK (Beta)
- Ruby SDK (Beta)
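As a quick illustration of the SDK route, here is a minimal sketch of instrumenting a Python application with the OpenLLMetry SDK. It assumes the `traceloop-sdk` package with its `Traceloop.init()` entry point and `@workflow` decorator; the app name, workflow name, and the body of the function are illustrative placeholders, not part of an official quickstart.

```python
# Minimal OpenLLMetry setup sketch (assumes: pip install traceloop-sdk).
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow

# Initialize tracing once at startup; app_name is an illustrative value.
Traceloop.init(app_name="my_llm_app")

# Mark a unit of work so it appears as a traced workflow.
@workflow(name="summarize")
def summarize(text: str) -> str:
    # Your LLM call would go here; once Traceloop.init() has run,
    # supported LLM provider calls are traced automatically.
    return text[:100]
```

With the SDK initialized, each request through a decorated workflow produces an execution trace, which is what powers the alerting and debugging features listed above.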

