
Introduction to LLM Tracing on Confident AI

Confident AI offers LLM tracing for teams to trace and monitor LLM applications. Think Datadog for LLM apps.

Why Use LLM Tracing & Observability on Confident AI?

  • Native to DeepEval, the most widely used LLM evaluation framework in the world.
  • Tracing is evals-first: you can trace and evaluate any component (retrievers, LLMs, tools, agents).
  • Only platform where you can:
    • Leverage DeepEval’s 50+ metrics.
    • Run evaluations on:
      • Individual spans (component-level)
      • Traces (end-to-end)
      • Threads (conversation evals)
    • Access unlimited evaluation use cases for chatbots, text-to-SQL, RAG pipelines, agentic workflows, document Q&A, summarization, code generation, translation, content moderation, and more.
Video: LLM Tracing for an Agentic RAG App
Get Started

Get LLM tracing for your LLM app with best-in-class evals.

Advanced Features

You can configure tracing on Confident AI in virtually any way you need.

FAQs

What evals are offered by Confident AI LLM tracing?

You can run evaluations using metrics for RAG, agents, and chatbots on:

  1. Traces (end-to-end)
  2. Spans (individual components)
  3. Threads (multi-turn conversations)

These can be run either online (evals run as traces are ingested into the platform) or offline (evals run retrospectively).

How will tracing affect my app?

Confident AI tracing is designed to be completely non-intrusive to your application. It:

  • Can be enabled or disabled at any time through the ENABLE_CONFIDENT_TRACING="YES"/"NO" environment variable.
  • Requires no rewrite of your existing code - just add the @observe decorator.
  • Runs asynchronously in the background with zero impact on latency.
  • Fails silently if there are any issues, ensuring your app keeps running.
  • Works with any function signature - you can set input/output at runtime.
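To illustrate the properties above, here is a minimal, hypothetical sketch of the non-intrusive decorator pattern the docs describe. This is not Confident AI's actual implementation; the `observe` name and the ENABLE_CONFIDENT_TRACING variable mirror this page, but `record_span` and the wrapper body are illustrative stand-ins.

```python
import functools
import os


def record_span(name, args, kwargs, result):
    # Placeholder for a real exporter, which would ship the span
    # asynchronously to the tracing platform.
    print(f"span recorded: {name}")


def observe(fn):
    """Wrap a function so tracing never interferes with its result."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)  # the original call is untouched
        # Tracing can be toggled via an environment variable.
        if os.getenv("ENABLE_CONFIDENT_TRACING", "YES") == "YES":
            try:
                record_span(fn.__name__, args, kwargs, result)
            except Exception:
                pass  # fail silently: tracing errors never reach the app
        return result
    return wrapper


@observe
def generate(prompt: str) -> str:
    # Any function signature works; the wrapper just passes args through.
    return f"answer to: {prompt}"
```

Because the wrapper returns `fn`'s result unchanged and swallows exporter errors, the decorated function behaves exactly as it did before decoration, which is the guarantee the bullet points above describe.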