OpenTelemetry (OTel) Integration
Send traces from any AI framework to AIGodfather — zero code changes, just two environment variables.
How It Works
AIGodfather exposes an OTLP-compatible HTTP endpoint at /api/otlp/v1/traces. Any framework that supports OpenTelemetry can send traces directly to your AIGodfather dashboard.
The endpoint automatically:
- Creates traces and spans in your existing Supabase tables
- Maps OTel semantic conventions (`gen_ai.*`) to our data model
- Detects span types (LLM, tool, retrieval) from attributes
- Extracts token usage, model info, and cost data
- De-duplicates spans via upsert
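The exact detection logic isn't spelled out here, but the idea behind span-type detection can be pictured as a small attribute heuristic. This is an illustrative sketch only; the function name and rules are assumptions, not AIGodfather's actual implementation:

```python
# Illustrative sketch of span-type detection from OTel attributes.
# The rules and attribute keys below are assumptions for illustration.
def detect_span_type(attributes: dict) -> str:
    # Any gen_ai.* attribute strongly suggests an LLM call
    if any(key.startswith("gen_ai.") for key in attributes):
        return "llm"
    # Tool invocations typically carry a tool name
    if "tool.name" in attributes:
        return "tool"
    # Retrieval spans typically carry a query attribute
    if "retrieval.query" in attributes:
        return "retrieval"
    return "span"

print(detect_span_type({"gen_ai.system": "openai"}))  # llm
print(detect_span_type({"tool.name": "web_search"}))  # tool
```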
Quick Setup (2 env vars)
Most OTel-compatible frameworks auto-detect these standard environment variables:
Set the endpoint
Point the OTLP exporter to your AIGodfather instance:
```
OTEL_EXPORTER_OTLP_ENDPOINT=https://www.aigodfather.ai/api/otlp
```

Set the API key header
Use your agent API key for authentication:
```
OTEL_EXPORTER_OTLP_HEADERS=x-api-key=agf_live_your_key_here
```

Supported Frameworks
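For copy-paste convenience, both variables can be exported in one shell snippet (the key below is a placeholder; substitute your own):

```shell
# Point any OTel SDK at AIGodfather; the key is a placeholder
export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.aigodfather.ai/api/otlp
export OTEL_EXPORTER_OTLP_HEADERS=x-api-key=agf_live_your_key_here
```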
Any framework that supports OpenTelemetry works out of the box. The sections below give framework-specific notes.
OpenAI Agents SDK
The OpenAI Agents SDK has built-in OTel support via OpenTelemetrySpanProcessor. Configure the OTLP exporter and set it as the trace processor.
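A minimal wiring sketch, assuming the SDK's `OpenTelemetrySpanProcessor` accepts a standard OTel span exporter; exact import paths vary by SDK version, so treat the commented registration lines as an assumption to verify against the SDK docs:

```python
# Sketch only: the exporter setup is standard OTel; the Agents SDK
# registration (commented) is an assumption based on the SDK's docs.
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://www.aigodfather.ai/api/otlp/v1/traces",
    headers={"x-api-key": "agf_live_your_key_here"},  # placeholder key
)

# Register the exporter with the Agents SDK's trace pipeline,
# e.g. (check your SDK version for the exact import path):
# add_trace_processor(OpenTelemetrySpanProcessor(exporter))
```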
LangChain / LangGraph
LangChain auto-detects the OTEL_EXPORTER_OTLP_* env vars. Just set the two variables — zero code changes.
LlamaIndex
LlamaIndex supports OpenTelemetry instrumentation. Configure a TracerProvider with the OTLP exporter.
CrewAI
CrewAI reads the standard OTel environment variables. Set them before importing CrewAI.
Generic (any OTel framework)
For any JavaScript or Python framework, configure the OTLPTraceExporter / OTLPSpanExporter with your AIGodfather endpoint and API key.
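In Python, that wiring looks like the following sketch. The endpoint and key are placeholders; the tracer name and span attributes are illustrative:

```python
# Generic OpenTelemetry setup pointed at AIGodfather (Python).
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://www.aigodfather.ai/api/otlp/v1/traces",
    headers={"x-api-key": "agf_live_your_key_here"},  # placeholder key
)
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Any spans created from here on are exported to AIGodfather
tracer = trace.get_tracer("my-agent")  # illustrative tracer name
with tracer.start_as_current_span("llm-call") as span:
    span.set_attribute("gen_ai.system", "openai")
```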
Attribute Mapping
The receiver maps standard `gen_ai.*` semantic conventions to AIGodfather columns:
| OTel Attribute | AIGodfather Column |
|---|---|
| `gen_ai.system` | `provider` |
| `gen_ai.request.model` | `model` |
| `gen_ai.usage.prompt_tokens` | `prompt_tokens` |
| `gen_ai.usage.completion_tokens` | `completion_tokens` |
| `gen_ai.usage.total_tokens` | `total_tokens` |
| `gen_ai.prompt` | `input` |
| `gen_ai.completion` | `output` |
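The mapping amounts to a dictionary translation over span attributes. A sketch (column names come from the table above; the function name and unmatched-key behavior are illustrative assumptions, not the actual receiver code):

```python
# Translate gen_ai.* span attributes to AIGodfather column names,
# per the mapping table above. Sketch only, not the receiver's code.
OTEL_TO_COLUMN = {
    "gen_ai.system": "provider",
    "gen_ai.request.model": "model",
    "gen_ai.usage.prompt_tokens": "prompt_tokens",
    "gen_ai.usage.completion_tokens": "completion_tokens",
    "gen_ai.usage.total_tokens": "total_tokens",
    "gen_ai.prompt": "input",
    "gen_ai.completion": "output",
}

def map_attributes(attrs: dict) -> dict:
    # Unmatched attributes are simply dropped in this sketch
    return {OTEL_TO_COLUMN[k]: v for k, v in attrs.items() if k in OTEL_TO_COLUMN}

row = map_attributes({
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4o",
    "gen_ai.usage.total_tokens": 128,
    "custom.attr": "ignored",
})
print(row)  # {'provider': 'openai', 'model': 'gpt-4o', 'total_tokens': 128}
```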
The endpoint authenticates requests via the `x-api-key` header. Make sure your API key is valid and belongs to an active agent.