Overview

Traces provide visibility into your LLM function executions. Every time a function is called via the SDK, a trace is recorded with:
  • Inputs: The data passed to the function
  • Outputs: The result returned by the LLM
  • Metadata: Timing, token usage, model information
  • Errors: Any errors that occurred during execution
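The fields above can be pictured as a single trace record. The sketch below is illustrative only; the field names (`duration_ms`, `input_tokens`, and so on) are assumptions for this example, not the SDK's actual schema:

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class Trace:
    """Illustrative trace record; field names are hypothetical."""
    function_name: str
    inputs: dict                      # data passed to the function
    outputs: Optional[dict]           # result returned by the LLM (None on failure)
    duration_ms: float                # execution time in milliseconds
    input_tokens: int                 # tokens in the prompt
    output_tokens: int                # tokens in the response
    model: str                        # e.g., "gpt-4o"
    error: Optional[str] = None       # populated if execution failed

    @property
    def total_tokens(self) -> int:
        return self.input_tokens + self.output_tokens

    @property
    def status(self) -> str:
        return "error" if self.error else "success"

t = Trace(
    function_name="extract_person",
    inputs={"text": "My name is John Doe and I work at Acme Corp."},
    outputs={"firstName": "John", "lastName": "Doe"},
    duration_ms=812.5,
    input_tokens=45,
    output_tokens=12,
    model="gpt-4o",
)
```

A successful call yields `status == "success"`, and `total_tokens` is simply the sum of input and output tokens.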

Viewing Traces

Navigate to Traces in the sidebar to see all traces for your organization.

Trace List

The trace list shows:
Column      Description
----------  ----------------------------------
Function    The function that was called
Status      Success, error, or pending
Duration    Execution time in milliseconds
Tokens      Input and output token count
Created     When the trace was recorded

Filtering Traces

Filter traces by:
  • Function: Select a specific function
  • Status: Success, error, or all
  • Date Range: Filter by time period
  • Source: Internal (function calls) or external (SDK tracing)

Searching Traces

Use the search bar to find traces by:
  • Input content
  • Output content
  • Error messages
  • Trace ID

Trace Details

Click on a trace to view details:

Input

The exact input passed to the function:
{
  "text": "My name is John Doe and I work at Acme Corp."
}

Output

The structured output returned:
{
  "firstName": "John",
  "lastName": "Doe"
}

Execution Details

Field             Description
----------------  ----------------------------------------
Model             The LLM model used (e.g., gpt-4o)
Duration          Total execution time
Input Tokens      Tokens in the prompt
Output Tokens     Tokens in the response
Total Tokens      Combined token count
Function Version  The version of the function used

Raw Request/Response

View the raw LLM API request and response for debugging:
  • Messages: The full conversation sent to the LLM
  • Response: The raw response from the LLM API
  • Headers: Request headers and metadata
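The raw view shows payloads shaped like a standard chat-completions exchange. The sketch below illustrates that shape; the exact content and token numbers are made up for the example:

```python
# Illustrative raw request: the full conversation sent to the LLM.
raw_request = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "Extract the person's name as JSON."},
        {"role": "user", "content": "My name is John Doe and I work at Acme Corp."},
    ],
}

# Illustrative raw response from the LLM API, including token usage.
raw_response = {
    "choices": [
        {"message": {"role": "assistant",
                     "content": '{"firstName": "John", "lastName": "Doe"}'}}
    ],
    "usage": {"prompt_tokens": 45, "completion_tokens": 12, "total_tokens": 57},
}
```

The usage block is where the Input Tokens, Output Tokens, and Total Tokens shown in Execution Details come from.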

External Traces

External traces are captured from SDK tracing integrations (e.g., the OpenAI Agents SDK).

Viewing External Traces

  1. Navigate to Traces
  2. Use the Source filter to select External
  3. Browse traces from your SDK integrations

External Trace Details

External traces include:
  • Workflow Name: The name of the workflow or agent
  • Group ID: Links related traces together
  • Spans: Individual LLM calls within the trace
  • Metadata: Custom metadata from your application

Span View

Each external trace may contain multiple spans:
Trace: Customer Support Bot
├── Span: Intent Classification (gpt-4o-mini)
├── Span: Knowledge Retrieval (gpt-4o)
└── Span: Response Generation (gpt-4o)
Click on individual spans to see their inputs, outputs, and timing.
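The trace/span hierarchy above can be modeled as a trace holding a list of spans. This is a sketch for illustration; the class and field names (including the `group_id` value) are hypothetical, not the SDK's actual types:

```python
from dataclasses import dataclass, field

@dataclass
class Span:
    """One LLM call within an external trace."""
    name: str
    model: str
    duration_ms: float

@dataclass
class ExternalTrace:
    workflow_name: str            # name of the workflow or agent
    group_id: str                 # links related traces together
    spans: list = field(default_factory=list)

trace = ExternalTrace(
    workflow_name="Customer Support Bot",
    group_id="session-123",       # hypothetical group id
    spans=[
        Span("Intent Classification", "gpt-4o-mini", 220.0),
        Span("Knowledge Retrieval", "gpt-4o", 640.0),
        Span("Response Generation", "gpt-4o", 910.0),
    ],
)

# Total workflow latency is the sum of its span durations
# (assuming the spans run sequentially, as in the tree above).
total_ms = sum(s.duration_ms for s in trace.spans)
```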

Tags

Organize traces with tags.

Adding Tags

  1. Open a trace
  2. Click Add Tag
  3. Select an existing tag or create a new one

Filtering by Tag

  1. Go to the trace list
  2. Click the tag filter
  3. Select one or more tags

Managing Tags

Navigate to Settings > Tags to:
  • Create new tags
  • Edit tag names and colors
  • Archive unused tags

Exporting Traces

Export traces for analysis:
  1. Filter to the traces you want to export
  2. Click Export
  3. Choose format (JSON or CSV)
  4. Download the file
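Once downloaded, an export can be analyzed with a few lines of code. The sketch below assumes the JSON export is a list of trace objects with `status` and `duration_ms` fields; the actual export schema may differ, so treat the field names as placeholders:

```python
import json

# Stand-in for reading the downloaded export file.
exported = json.loads("""[
  {"function": "extract_person", "status": "success", "duration_ms": 812},
  {"function": "extract_person", "status": "error",   "duration_ms": 95},
  {"function": "summarize",      "status": "success", "duration_ms": 1430}
]""")

# Share of failed traces (useful for the "Monitor Errors" practice below).
error_rate = sum(t["status"] == "error" for t in exported) / len(exported)

# Slowest trace (useful for the "Track Latency" practice below).
slowest = max(exported, key=lambda t: t["duration_ms"])
```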

Best Practices

  • Monitor Errors: Regularly check for failed traces to identify issues
  • Track Latency: Use duration data to optimize slow functions
  • Use Tags: Organize traces by feature, environment, or user
  • Review Outputs: Periodically review outputs to ensure quality