Open-source observability for AI applications - trace every LLM call, prompt, token, retrieval step, and agent decision.
Built on OpenTelemetry, traceAI sends structured traces to any OTel-compatible backend (Datadog, Grafana, Jaeger, Future AGI, and more). No new vendor. No new dashboard.
traceAI is an open-source library that gives you full visibility into your AI applications. It captures every LLM call, prompt, token count, retrieval step, and agent decision as structured traces and sends them to whatever observability tool you already use.
It is built on OpenTelemetry, the industry standard for application observability, so your AI traces live natively in Datadog, Grafana, Future AGI, Jaeger, or any other OTel-compatible backend alongside the rest of your telemetry.
- Zero-config tracing for 50+ AI frameworks across 4 languages
- OpenTelemetry-native - works with any OTel-compatible backend
- Semantic conventions for LLM calls, agents, tools, retrieval, and vector databases
- Python, TypeScript, Java, and C# support with consistent APIs
- Key Features
- Quickstart
- Supported Frameworks
- Compatibility Matrix
- Architecture
- Roadmap
- Contributing
- Contributors
- Resources
- Connect With Us
| Feature | Description |
|---|---|
| Standardized Tracing | Maps AI workflows to consistent OpenTelemetry spans and attributes |
| Zero-Config Setup | Drop-in instrumentation with minimal code changes |
| Multi-Framework | 50+ integrations across Python, TypeScript, Java, and C# |
| Vendor Agnostic | Works with any OpenTelemetry-compatible backend |
| Rich Context | Captures prompts, completions, tokens, model params, tool calls, and more |
| Production Ready | Async support, streaming, error handling, and performance optimizations |
**Python**

1. Install

```bash
pip install traceai-openai
```

2. Instrument your application

```python
import os

import openai
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor

# Set up environment variables
os.environ["FI_API_KEY"] = "<your-api-key>"
os.environ["FI_SECRET_KEY"] = "<your-secret-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-key>"

# Register tracer provider
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_ai_app"
)

# Instrument OpenAI
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Use OpenAI as normal - tracing happens automatically!
response = openai.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "Hello!"}]
)
```

Tip: Swap `traceai-openai` for any supported framework (e.g., `traceai-langchain`, `traceai-anthropic`)
**TypeScript**

1. Install

```bash
npm install @traceai/openai @traceai/fi-core
```

2. Instrument your application

```typescript
import { register, ProjectType } from "@traceai/fi-core";
import { OpenAIInstrumentation } from "@traceai/openai";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import OpenAI from "openai";

// Register tracer provider
const tracerProvider = register({
  projectName: "my_ai_app",
  projectType: ProjectType.OBSERVE,
});

// Register OpenAI instrumentation (before creating client!)
registerInstrumentations({
  tracerProvider,
  instrumentations: [new OpenAIInstrumentation()],
});

// Use OpenAI as normal - tracing happens automatically!
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const response = await openai.chat.completions.create({
  model: "gpt-4.1",
  messages: [{ role: "user", content: "Hello!" }],
});
```

**Java**

1. Add dependency (via JitPack)
```xml
<dependency>
  <groupId>com.github.future-agi.traceAI</groupId>
  <artifactId>traceai-java-openai</artifactId>
  <version>v1.0.0</version>
</dependency>
```

2. Instrument your application

```java
import ai.traceai.core.TraceAI;
import ai.traceai.openai.TracedOpenAIClient;

// Initialize tracing
TraceAI.init("my_ai_app", "your-api-key", "your-secret-key");

// Wrap your OpenAI client
var tracedClient = new TracedOpenAIClient(openAIClient);

// Use as normal - tracing happens automatically!
var response = tracedClient.chatCompletion(request);
```

**C#**

1. Install
```bash
dotnet add package fi-instrumentation-otel
```

2. Instrument your application

```csharp
using FIInstrumentation;

// Initialize tracing
var tracer = FITracer.Initialize(new FITracerOptions
{
    ProjectName = "my_ai_app",
    ApiKey = "your-api-key",
    SecretKey = "your-secret-key"
});

// Use the tracer with your AI calls
```

**Python**

| Package | Description |
|---|---|
| `fi-instrumentation-otel` | Core instrumentation library |
**LLM Providers**

| Package | Description |
|---|---|
| `traceAI-openai` | OpenAI |
| `traceAI-anthropic` | Anthropic |
| `traceAI-google-genai` | Google Generative AI |
| `traceAI-vertexai` | Google Vertex AI |
| `traceAI-bedrock` | AWS Bedrock |
| `traceAI-mistralai` | Mistral AI |
| `traceAI-groq` | Groq |
| `traceAI-litellm` | LiteLLM |
| `traceAI-cohere` | Cohere |
| `traceAI-ollama` | Ollama |
| `traceAI-together` | Together AI |
| `traceAI-deepseek` | DeepSeek |
| `traceAI-fireworks` | Fireworks AI |
| `traceAI-cerebras` | Cerebras |
| `traceAI-huggingface` | HuggingFace |
| `traceAI-xai` | xAI (Grok) |
| `traceAI-vllm` | vLLM |
**Agent Frameworks**

| Package | Description |
|---|---|
| `traceAI-langchain` | LangChain |
| `traceAI-llamaindex` | LlamaIndex |
| `traceAI-crewai` | CrewAI |
| `traceAI-openai-agents` | OpenAI Agents |
| `traceAI-smolagents` | SmolAgents |
| `traceAI-autogen` | AutoGen |
| `traceAI-google-adk` | Google ADK |
| `traceAI-agno` | Agno |
| `traceAI-pydantic-ai` | Pydantic AI |
| `traceAI-claude-agent-sdk` | Claude Agent SDK |
| `traceAI-strands` | AWS Strands Agents |
| `traceAI-beeai` | IBM BeeAI |
**Tools & Libraries**

| Package | Description |
|---|---|
| `traceAI-haystack` | Haystack |
| `traceAI-dspy` | DSPy |
| `traceAI-guardrails` | Guardrails AI |
| `traceAI-instructor` | Instructor |
| `traceAI-portkey` | Portkey |
| `traceAI-mcp` | Model Context Protocol |
| `traceAI-pipecat` | Pipecat (Voice AI) |
| `traceAI-livekit` | LiveKit (Real-time) |
**Vector Databases**

| Package | Description |
|---|---|
| `traceAI-pinecone` | Pinecone |
| `traceAI-chromadb` | ChromaDB |
| `traceAI-qdrant` | Qdrant |
| `traceAI-weaviate` | Weaviate |
| `traceAI-milvus` | Milvus |
| `traceAI-lancedb` | LanceDB |
| `traceAI-mongodb` | MongoDB Atlas Vector Search |
| `traceAI-pgvector` | pgvector (PostgreSQL) |
| `traceAI-redis` | Redis Vector Search |
**TypeScript**

| Package | Description |
|---|---|
| `@traceai/fi-core` | Core instrumentation library |
| `@traceai/fi-semantic-conventions` | Semantic conventions |
**LLM Providers**

| Package | Description |
|---|---|
| `@traceai/openai` | OpenAI |
| `@traceai/anthropic` | Anthropic |
| `@traceai/google-genai` | Google Generative AI |
| `@traceai/fi-instrumentation-vertexai` | Google Vertex AI |
| `@traceai/bedrock` | AWS Bedrock |
| `@traceai/mistral` | Mistral AI |
| `@traceai/groq` | Groq |
| `@traceai/cohere` | Cohere |
| `@traceai/ollama` | Ollama |
| `@traceai/together` | Together AI |
| `@traceai/deepseek` | DeepSeek |
| `@traceai/fireworks` | Fireworks AI |
| `@traceai/cerebras` | Cerebras |
| `@traceai/huggingface` | HuggingFace |
| `@traceai/xai` | xAI (Grok) |
| `@traceai/vllm` | vLLM |
**Agent Frameworks**

| Package | Description |
|---|---|
| `@traceai/langchain` | LangChain.js |
| `@traceai/llamaindex` | LlamaIndex |
| `@traceai/openai-agents` | OpenAI Agents |
| `@traceai/fi-instrumentation-google-adk` | Google ADK |
| `@traceai/mastra` | Mastra |
| `@traceai/beeai` | IBM BeeAI |
| `@traceai/strands` | AWS Strands Agents |
**Tools & Libraries**

| Package | Description |
|---|---|
| `@traceai/vercel` | Vercel AI SDK |
| `@traceai/guardrails` | Guardrails AI |
| `@traceai/instructor` | Instructor |
| `@traceai/portkey` | Portkey |
| `@traceai/mcp` | Model Context Protocol |
| `@traceai/fi-instrumentation-pipecat` | Pipecat (Voice AI) |
| `@traceai/fi-instrumentation-livekit` | LiveKit (Real-time) |
**Vector Databases**

| Package | Description |
|---|---|
| `@traceai/pinecone` | Pinecone |
| `@traceai/chromadb` | ChromaDB |
| `@traceai/qdrant` | Qdrant |
| `@traceai/weaviate` | Weaviate |
| `@traceai/milvus` | Milvus |
| `@traceai/lancedb` | LanceDB |
| `@traceai/mongodb` | MongoDB Atlas Vector Search |
| `@traceai/pgvector` | pgvector (PostgreSQL) |
| `@traceai/redis` | Redis Vector Search |
**Java**

Available via JitPack. Add the JitPack repository:

```xml
<repositories>
  <repository>
    <id>jitpack.io</id>
    <url>https://jitpack.io</url>
  </repository>
</repositories>
```

| Package | Description |
|---|---|
| `traceai-java-core` | Core instrumentation library |
**LLM Providers**

| Package | Description |
|---|---|
| `traceai-java-openai` | OpenAI |
| `traceai-java-azure-openai` | Azure OpenAI |
| `traceai-java-anthropic` | Anthropic |
| `traceai-java-google-genai` | Google Generative AI |
| `traceai-java-cohere` | Cohere |
| `traceai-java-ollama` | Ollama |
| `traceai-java-bedrock` | AWS Bedrock |
| `traceai-java-vertexai` | Google Vertex AI |
| `traceai-java-watsonx` | IBM Watsonx |
**Agent Frameworks**

| Package | Description |
|---|---|
| `traceai-langchain4j` | LangChain4j |
| `traceai-spring-ai` | Spring AI |
| `traceai-spring-boot-starter` | Spring Boot Auto-Configuration |
| `traceai-java-semantic-kernel` | Microsoft Semantic Kernel |
**Vector Databases**

| Package | Description |
|---|---|
| `traceai-java-pinecone` | Pinecone |
| `traceai-java-qdrant` | Qdrant |
| `traceai-java-milvus` | Milvus |
| `traceai-java-weaviate` | Weaviate |
| `traceai-java-chromadb` | ChromaDB |
| `traceai-java-mongodb` | MongoDB Atlas Vector Search |
| `traceai-java-redis` | Redis Vector Search |
| `traceai-java-azure-search` | Azure AI Search |
| `traceai-java-pgvector` | pgvector (PostgreSQL) |
| `traceai-java-elasticsearch` | Elasticsearch |
**C#**

Available on NuGet.

| Package | Description |
|---|---|
| `fi-instrumentation-otel` | Core instrumentation library |
| Category | Framework | Python | TypeScript | Java | C# |
|---|---|---|---|---|---|
| LLM Providers | OpenAI | ✅ | ✅ | ✅ | ✅ |
| | Anthropic | ✅ | ✅ | ✅ | |
| | AWS Bedrock | ✅ | ✅ | ✅ | |
| | Google Vertex AI | ✅ | ✅ | ✅ | |
| | Google Generative AI | ✅ | ✅ | ✅ | |
| | Mistral AI | ✅ | ✅ | | |
| | Groq | ✅ | ✅ | | |
| | Cohere | ✅ | ✅ | ✅ | |
| | Ollama | ✅ | ✅ | ✅ | |
| | LiteLLM | ✅ | | | |
| | Together AI | ✅ | ✅ | | |
| | DeepSeek | ✅ | ✅ | | |
| | Fireworks AI | ✅ | ✅ | | |
| | Cerebras | ✅ | ✅ | | |
| | HuggingFace | ✅ | ✅ | | |
| | xAI (Grok) | ✅ | ✅ | | |
| | vLLM | ✅ | ✅ | | |
| | Azure OpenAI | | | ✅ | |
| | IBM Watsonx | | | ✅ | |
| Agent Frameworks | LangChain | ✅ | ✅ | | |
| | LlamaIndex | ✅ | ✅ | | |
| | CrewAI | ✅ | | | |
| | AutoGen | ✅ | | | |
| | OpenAI Agents | ✅ | ✅ | | |
| | SmolAgents | ✅ | | | |
| | Google ADK | ✅ | ✅ | | |
| | Agno | ✅ | | | |
| | Pydantic AI | ✅ | | | |
| | Claude Agent SDK | ✅ | | | |
| | AWS Strands Agents | ✅ | ✅ | | |
| | IBM BeeAI | ✅ | ✅ | | |
| | Mastra | | ✅ | | |
| | LangChain4j | | | ✅ | |
| | Spring AI | | | ✅ | |
| | Semantic Kernel | | | ✅ | |
| Tools & Libraries | Haystack | ✅ | | | |
| | DSPy | ✅ | | | |
| | Guardrails AI | ✅ | ✅ | | |
| | Instructor | ✅ | ✅ | | |
| | Portkey | ✅ | ✅ | | |
| | Vercel AI SDK | | ✅ | | |
| | MCP | ✅ | ✅ | | |
| | Pipecat | ✅ | ✅ | | |
| | LiveKit | ✅ | ✅ | | |
| Vector Databases | Pinecone | ✅ | ✅ | ✅ | |
| | ChromaDB | ✅ | ✅ | ✅ | |
| | Qdrant | ✅ | ✅ | ✅ | |
| | Weaviate | ✅ | ✅ | ✅ | |
| | Milvus | ✅ | ✅ | ✅ | |
| | LanceDB | ✅ | ✅ | | |
| | MongoDB Atlas | ✅ | ✅ | ✅ | |
| | pgvector | ✅ | ✅ | ✅ | |
| | Redis | ✅ | ✅ | ✅ | |
| | Azure AI Search | | | ✅ | |
| | Elasticsearch | | | ✅ | |
Legend: ✅ Supported | blank = not yet available
traceAI is built on top of OpenTelemetry and follows standard OTel instrumentation patterns:
Full OpenTelemetry Compatibility
- Works with any OTel-compatible backend
- Standard OTLP exporters (HTTP/gRPC)
- Compatible with existing OTel setups
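Because the exporters are standard OTLP, spans can typically be routed to any collector with the stock OpenTelemetry SDK environment variables. A minimal sketch; these variable names come from the OTel specification, not from traceAI, so check which of them the `register()` helper honors in your setup:

```shell
# Standard OTel SDK environment variables (not traceAI-specific):
# route spans to any OTLP-compatible collector over HTTP.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
# Optional auth headers, comma-separated key=value pairs.
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=your-api-key"
```

The same variables are read by every OTel SDK, so one configuration convention covers Python, TypeScript, Java, and C# services alike.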
Bring Your Own Configuration
You can use traceAI with your own OpenTelemetry setup:
Python: Custom TracerProvider & Exporters
```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceai_openai import OpenAIInstrumentor

# Set up your own tracer provider
tracer_provider = TracerProvider()
trace.set_tracer_provider(tracer_provider)

# Add custom exporters (example with Future AGI)
otlp_exporter = OTLPSpanExporter(
    endpoint="https://api.futureagi.com/tracer/v1/traces",
    headers={
        "X-API-KEY": "your-api-key",
        "X-SECRET-KEY": "your-secret-key"
    }
)
tracer_provider.add_span_processor(BatchSpanProcessor(otlp_exporter))

# Instrument with traceAI
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

TypeScript: Custom TracerProvider, Span Processors & Headers
```typescript
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { Resource } from "@opentelemetry/resources";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@traceai/openai";

const provider = new NodeTracerProvider({
  resource: new Resource({ "service.name": "my-ai-service" }),
});

const exporter = new OTLPTraceExporter({
  url: "https://api.futureagi.com/tracer/v1/traces",
  headers: {
    "X-API-KEY": process.env.FI_API_KEY!,
    "X-SECRET-KEY": process.env.FI_SECRET_KEY!,
  },
});

provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

registerInstrumentations({
  tracerProvider: provider,
  instrumentations: [new OpenAIInstrumentation()],
});
```

What Gets Captured
traceAI automatically captures rich telemetry data:
- Prompts & Completions: Full request/response content
- Token Usage: Input, output, and total tokens
- Model Parameters: Temperature, top_p, max_tokens, etc.
- Tool Calls: Function/tool names, arguments, and results
- Streaming: Individual chunks with delta tracking
- Errors: Detailed error context and stack traces
- Timing: Latency at each step of the AI workflow
All data follows OpenTelemetry Semantic Conventions for GenAI.
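As an illustration, a span's attribute payload in the GenAI conventions looks roughly like the dict below, and downstream tooling can aggregate over it. The attribute keys come from the OTel GenAI specification; the exact set traceAI emits may differ, so treat this as a sketch rather than the library's contract:

```python
# Illustrative span attributes in OTel GenAI semantic conventions.
# Keys are from the OTel spec; the exact keys traceAI emits may differ.
span_attributes = {
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4.1",
    "gen_ai.request.temperature": 0.2,
    "gen_ai.usage.input_tokens": 9,
    "gen_ai.usage.output_tokens": 12,
}

def total_tokens(attrs: dict) -> int:
    """Total tokens recorded on one span (input + output)."""
    return (attrs.get("gen_ai.usage.input_tokens", 0)
            + attrs.get("gen_ai.usage.output_tokens", 0))

print(total_tokens(span_attributes))  # 21
```

Because the keys are standardized, the same aggregation works across every instrumented provider, not just OpenAI.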
- Go language support
- Sampling strategies for high-volume production environments
- Continuous semantic convention updates as the OTel GenAI spec evolves
- Evaluation integration connecting traces to quality measurement pipelines
- Expanded agent framework coverage
See our ROADMAP.md for the full roadmap.
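Until built-in sampling strategies land, head sampling can be done today with a standard OTel sampler. The sketch below is a pure-Python illustration of the decision rule used by OTel's stock `TraceIdRatioBased` sampler (not traceAI code): keep a trace iff the low 64 bits of its trace ID fall below `ratio * 2**64`:

```python
# Minimal head-sampling sketch mirroring the decision rule of OTel's
# TraceIdRatioBased sampler. The decision is deterministic per trace ID,
# so every service in a distributed trace agrees without coordination.
def keep_trace(trace_id: int, ratio: float) -> bool:
    """Return True iff the trace should be kept at the given sample ratio."""
    bound = round(ratio * (1 << 64))
    return (trace_id & ((1 << 64) - 1)) < bound

print(keep_trace(0x1, 0.5))            # True  (low bits near zero)
print(keep_trace((1 << 64) - 1, 0.5))  # False (low bits maximal)
```

In practice you would pass `TraceIdRatioBased(ratio)` as the `sampler` of your own `TracerProvider` rather than reimplementing the rule.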
We welcome contributions! Read our Contributing Guide for details.
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
Found a bug? Open an issue with a minimal reproduction.
| Resource | Description |
|---|---|
| Website | Learn more about Future AGI |
| Documentation | Complete guides and API reference |
| Cookbooks | Step-by-step implementation examples |
| Roadmap | Planned features and integrations |
| Changelog | All release notes and updates |
| Contributing Guide | How to contribute to traceAI |
| Slack | Join our community |
| Issues | Report bugs or request features |
Built with care by the Future AGI team
