LLM Observability

Agentgateway can send LLM telemetry to specialized observability platforms for prompt logging, cost tracking, audit trails, and performance monitoring.

How it works

Agentgateway exports LLM telemetry via OpenTelemetry, which can be forwarded to LLM-specific observability platforms. These platforms typically provide:

  • Prompt/response logging - Full request and response capture
  • Token usage tracking - Monitor costs across models and users
  • Latency analytics - Track response times and identify bottlenecks
  • Evaluation - Score and evaluate LLM outputs
  • Prompt management - Version and manage prompts

Configuration

Enable OpenTelemetry tracing with LLM-specific attributes.

# yaml-language-server: $schema=https://agentgateway.dev/schema/config
config:
  tracing:
    otlpEndpoint: http://localhost:4317
    randomSampling: true

binds:
- port: 3000
  listeners:
  - routes:
    - backends:
      - ai:
          name: openai
          provider:
            openAI:
              model: gpt-4o-mini
      policies:
        backendAuth:
          key: "$OPENAI_API_KEY"
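The OTLP endpoint above can point at any OpenTelemetry-compatible collector. As a sketch (the exporter name and ingest endpoint below are placeholders, not agentgateway- or platform-specific values), an OpenTelemetry Collector could receive these traces and forward them to an LLM observability platform:

```yaml
# OpenTelemetry Collector sketch: receive OTLP from agentgateway on 4317
# and forward traces to a hypothetical LLM observability platform.
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

exporters:
  otlphttp/llm-platform:
    # Placeholder endpoint; substitute your platform's OTLP ingest URL.
    endpoint: https://example-llm-platform.invalid/otlp
    headers:
      # Most platforms authenticate via a header; the exact name varies.
      authorization: Bearer ${env:LLM_PLATFORM_API_KEY}

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlphttp/llm-platform]
```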

Agentgateway automatically includes these LLM-specific trace attributes:

  • gen_ai.operation.name - Operation type (chat, completion, embedding)
  • gen_ai.request.model - Requested model name
  • gen_ai.response.model - Actual model used
  • gen_ai.usage.input_tokens - Input token count
  • gen_ai.usage.output_tokens - Output token count
  • gen_ai.provider.name - LLM provider (openai, anthropic, etc.)
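For instance, a single chat completion routed through the configuration above might carry attributes like the following (values are illustrative, not captured output):

```yaml
# Illustrative attribute values for one chat-completion span
gen_ai.operation.name: chat
gen_ai.provider.name: openai
gen_ai.request.model: gpt-4o-mini
gen_ai.response.model: gpt-4o-mini-2024-07-18
gen_ai.usage.input_tokens: 142
gen_ai.usage.output_tokens: 57
```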