OpenTelemetry (OTel) Integration

Send traces from any AI framework to AIGodfather — zero code changes, just two environment variables.

How It Works

AIGodfather exposes an OTLP-compatible HTTP endpoint at /api/otlp/v1/traces. Any framework that supports OpenTelemetry can send traces directly to your AIGodfather dashboard.

The endpoint automatically:

  • Creates traces and spans in your existing Supabase tables
  • Maps OTel semantic conventions (gen_ai.*) to our data model
  • Detects span types (LLM, tool, retrieval) from attributes
  • Extracts token usage, model info, and cost data
  • De-duplicates spans via upsert

Quick Setup (2 env vars)

Most OTel-compatible frameworks auto-detect these standard environment variables:

1. Set the endpoint

Point the OTLP exporter to your AIGodfather instance:

OTEL_EXPORTER_OTLP_ENDPOINT=https://www.aigodfather.ai/api/otlp

2. Set the API key header

Use your agent API key for authentication:

OTEL_EXPORTER_OTLP_HEADERS=x-api-key=agf_live_your_key_here

Your existing agent API keys work with OpenTelemetry — no separate key needed. Go to Dashboard → API Keys to copy yours.
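Together, in a shell, the two variables look like this (the key below is a placeholder; copy your real one from the dashboard):

```shell
# Standard OTel env vars, auto-detected by most frameworks.
# Replace the key with your own from Dashboard -> API Keys.
export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.aigodfather.ai/api/otlp
export OTEL_EXPORTER_OTLP_HEADERS=x-api-key=agf_live_your_key_here
```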

Supported Frameworks

Any framework that supports OpenTelemetry works out of the box. The sections below cover framework-specific setup; a complete OpenAI Agents SDK example appears at the end of this page.

OpenAI Agents SDK

The OpenAI Agents SDK has built-in OTel support via OpenTelemetrySpanProcessor. Configure the OTLP exporter and set it as the trace processor.

LangChain / LangGraph

LangChain auto-detects the OTEL_EXPORTER_OTLP_* env vars. Just set the two variables — zero code changes.

LlamaIndex

LlamaIndex supports OpenTelemetry instrumentation. Configure a TracerProvider with the OTLP exporter.

CrewAI

CrewAI reads the standard OTel environment variables. Set them before importing CrewAI.
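A minimal sketch of the ordering that matters for CrewAI (the CrewAI import itself is shown commented out; the key point is simply setting the variables before the framework loads):

```python
import os

# Set the standard OTel env vars BEFORE importing CrewAI, so its
# instrumentation picks them up at import time.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://www.aigodfather.ai/api/otlp"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-api-key=agf_live_your_key_here"

# Only now import and use CrewAI as usual, e.g.:
# from crewai import Agent, Crew, Task
```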

Generic (any OTel framework)

For any JavaScript or Python framework, configure the OTLPTraceExporter / OTLPSpanExporter with your AIGodfather endpoint and API key.

Attribute Mapping

The receiver maps standard gen_ai semantic conventions to AIGodfather columns:

OTel Attribute                    AIGodfather Column
gen_ai.system                     provider
gen_ai.request.model              model
gen_ai.usage.prompt_tokens        prompt_tokens
gen_ai.usage.completion_tokens    completion_tokens
gen_ai.usage.total_tokens         total_tokens
gen_ai.prompt                     input
gen_ai.completion                 output
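To make the table concrete, the receiver's mapping behaves roughly like this (a client-side sketch; the actual server-side code is not shown here, so the names `OTEL_TO_AIGODFATHER` and `map_attributes` are illustrative):

```python
# Sketch of the attribute mapping in the table above.
OTEL_TO_AIGODFATHER = {
    "gen_ai.system": "provider",
    "gen_ai.request.model": "model",
    "gen_ai.usage.prompt_tokens": "prompt_tokens",
    "gen_ai.usage.completion_tokens": "completion_tokens",
    "gen_ai.usage.total_tokens": "total_tokens",
    "gen_ai.prompt": "input",
    "gen_ai.completion": "output",
}

def map_attributes(otel_attrs: dict) -> dict:
    """Translate OTel gen_ai.* attributes into AIGodfather column names."""
    return {
        OTEL_TO_AIGODFATHER[key]: value
        for key, value in otel_attrs.items()
        if key in OTEL_TO_AIGODFATHER
    }
```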
⚠️ The OTLP endpoint authenticates via the x-api-key header. Make sure your API key is valid and belongs to an active agent.
# OpenAI Agents SDK — built-in OTel support
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from agents.tracing import set_trace_processor
from agents.tracing.processors import OpenTelemetrySpanProcessor

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint="https://www.aigodfather.ai/api/otlp/v1/traces",
            headers={"x-api-key": "agf_live_your_key_here"}
        )
    )
)

set_trace_processor(OpenTelemetrySpanProcessor(provider))

# Now run your agent as normal — traces appear in AIGodfather
from agents import Agent, Runner

agent = Agent(name="support-bot", instructions="Help users")
result = Runner.run_sync(agent, "How do I reset my password?")
print(result.final_output)