Automatic LLM Observability tracing for OpenAI, Anthropic, LangChain, LangGraph, Vertex AI, Google GenAI, and AWS Bedrock.
dd-trace instruments AI/ML SDKs to automatically capture model inputs, outputs, token counts, and latency. These spans feed into LLM Observability in the Datadog UI.
LLM Observability is enabled automatically when the AI plugins are loaded. To send LLM spans to the Datadog LLM Observability product, set the `DD_LLMOBS_ENABLED` and `DD_LLMOBS_ML_APP` environment variables.
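For example, a minimal setup might look like the following. The application name `my-app` is illustrative, and the agentless variables are only needed when no Datadog Agent is running:

```shell
# Enable LLM Observability and name the ML application (name is illustrative)
export DD_LLMOBS_ENABLED=1
export DD_LLMOBS_ML_APP=my-app

# Agentless mode: send spans directly to Datadog without a local Agent
export DD_LLMOBS_AGENTLESS_ENABLED=1
export DD_API_KEY=<your-datadog-api-key>

node app.js
```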
```js
const tracer = require('dd-trace').init()
tracer.use('openai', { service: 'openai' })

const OpenAI = require('openai')
const client = new OpenAI()

// Spans are created automatically for completions, embeddings, etc.
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello' }]
})
```
The plugin captures:
- Model name and version
- Input prompt and completion text
- Token counts (prompt, completion, total)
- Request duration and status
Prompt and completion content may contain sensitive data. Review your data retention settings in Datadog before enabling content capture in production.
```js
const tracer = require('dd-trace').init()
tracer.use('langchain', { service: 'langchain' })

// Works with @langchain/core and @langchain/openai
const { ChatOpenAI } = require('@langchain/openai')
const model = new ChatOpenAI({ model: 'gpt-4o' })

// Chain invocations, tool calls, and retriever calls are all traced
const result = await model.invoke('What is the capital of France?')
```