
traceAI

Open-source observability for AI applications - trace every LLM call, prompt, token, retrieval step, and agent decision.

Built on OpenTelemetry, traceAI sends structured traces to any OTel-compatible backend (Datadog, Grafana, Jaeger, Future AGI, and more). No new vendor. No new dashboard.


Documentation | Examples | Slack | PyPI | npm | NuGet


What is traceAI?

traceAI is an open-source library that gives you full visibility into your AI applications. It captures every LLM call, prompt, token count, retrieval step, and agent decision as structured traces and sends them to whatever observability tool you already use.

It is built on OpenTelemetry, the industry standard for application observability. Your AI traces live natively in Datadog, Grafana, Future AGI, Jaeger, or any OTel-compatible backend. No new vendor. No new dashboard.

  • Zero-config tracing for 50+ AI frameworks across 4 languages
  • OpenTelemetry-native - works with any OTel-compatible backend
  • Semantic conventions for LLM calls, agents, tools, retrieval, and vector databases
  • Python, TypeScript, Java, and C# support with consistent APIs


Key Features

| Feature | Description |
| --- | --- |
| Standardized Tracing | Maps AI workflows to consistent OpenTelemetry spans and attributes |
| Zero-Config Setup | Drop-in instrumentation with minimal code changes |
| Multi-Framework | 50+ integrations across Python, TypeScript, Java, and C# |
| Vendor Agnostic | Works with any OpenTelemetry-compatible backend |
| Rich Context | Captures prompts, completions, tokens, model params, tool calls, and more |
| Production Ready | Async support, streaming, error handling, and performance optimized |
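The "zero-config" pattern above comes down to wrapping a client's request method so every call emits a span. A toy, pure-Python sketch of the idea follows (not traceAI's actual internals — the real instrumentors emit OpenTelemetry spans; `FakeClient` here is a stand-in for an LLM client):

```python
import functools
import time

def instrument(client, spans):
    """Monkey-patch client.create so every call records a span-like dict."""
    original = client.create

    @functools.wraps(original)
    def traced(*args, **kwargs):
        span = {"name": "llm.call", "model": kwargs.get("model"), "status": "OK"}
        start = time.perf_counter()
        try:
            return original(*args, **kwargs)
        except Exception as exc:
            span["status"] = f"ERROR: {exc}"
            raise
        finally:
            # Record duration and store the span whether the call succeeded or not.
            span["duration_s"] = time.perf_counter() - start
            spans.append(span)

    client.create = traced

class FakeClient:
    """Stand-in for an LLM client; returns a canned completion."""
    def create(self, model=None, messages=None):
        return {"choices": [{"message": {"content": "hi"}}]}

spans = []
client = FakeClient()
instrument(client, spans)
client.create(model="gpt-4.1", messages=[{"role": "user", "content": "Hello!"}])
```

After the call, `spans[0]` holds the model name, status, and latency — the same categories of data the real instrumentors attach to OpenTelemetry spans.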

Quickstart

Python Quickstart

1. Install

pip install traceai-openai

2. Instrument your application

import os
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor
import openai

# Set up environment variables
os.environ["FI_API_KEY"] = "<your-api-key>"
os.environ["FI_SECRET_KEY"] = "<your-secret-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-key>"

# Register tracer provider
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_ai_app"
)

# Instrument OpenAI
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Use OpenAI as normal - tracing happens automatically!
response = openai.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "Hello!"}]
)

Tip: Swap traceai-openai for any supported framework (e.g., traceai-langchain, traceai-anthropic).


TypeScript Quickstart

1. Install

npm install @traceai/openai @traceai/fi-core

2. Instrument your application

import { register, ProjectType } from "@traceai/fi-core";
import { OpenAIInstrumentation } from "@traceai/openai";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import OpenAI from "openai";

// Register tracer provider
const tracerProvider = register({
  projectName: "my_ai_app",
  projectType: ProjectType.OBSERVE,
});

// Register OpenAI instrumentation (before creating client!)
registerInstrumentations({
  tracerProvider,
  instrumentations: [new OpenAIInstrumentation()],
});

// Use OpenAI as normal - tracing happens automatically!
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const response = await openai.chat.completions.create({
  model: "gpt-4.1",
  messages: [{ role: "user", content: "Hello!" }],
});

Java Quickstart

1. Add dependency (via JitPack)

<dependency>
    <groupId>com.github.future-agi.traceAI</groupId>
    <artifactId>traceai-java-openai</artifactId>
    <version>v1.0.0</version>
</dependency>

2. Instrument your application

import ai.traceai.core.TraceAI;
import ai.traceai.openai.TracedOpenAIClient;

// Initialize tracing
TraceAI.init("my_ai_app", "your-api-key", "your-secret-key");

// Wrap your OpenAI client
var tracedClient = new TracedOpenAIClient(openAIClient);

// Use as normal - tracing happens automatically!
var response = tracedClient.chatCompletion(request);

C# Quickstart

1. Install

dotnet add package fi-instrumentation-otel

2. Instrument your application

using FIInstrumentation;

// Initialize tracing
var tracer = FITracer.Initialize(new FITracerOptions
{
    ProjectName = "my_ai_app",
    ApiKey = "your-api-key",
    SecretKey = "your-secret-key"
});

// Use the tracer with your AI calls

Supported Frameworks

Python

| Package | Description |
| --- | --- |
| fi-instrumentation-otel | Core instrumentation library |

LLM Providers

| Package | Description |
| --- | --- |
| traceAI-openai | OpenAI |
| traceAI-anthropic | Anthropic |
| traceAI-google-genai | Google Generative AI |
| traceAI-vertexai | Google Vertex AI |
| traceAI-bedrock | AWS Bedrock |
| traceAI-mistralai | Mistral AI |
| traceAI-groq | Groq |
| traceAI-litellm | LiteLLM |
| traceAI-cohere | Cohere |
| traceAI-ollama | Ollama |
| traceAI-together | Together AI |
| traceAI-deepseek | DeepSeek |
| traceAI-fireworks | Fireworks AI |
| traceAI-cerebras | Cerebras |
| traceAI-huggingface | HuggingFace |
| traceAI-xai | xAI (Grok) |
| traceAI-vllm | vLLM |

Agent Frameworks

| Package | Description |
| --- | --- |
| traceAI-langchain | LangChain |
| traceAI-llamaindex | LlamaIndex |
| traceAI-crewai | CrewAI |
| traceAI-openai-agents | OpenAI Agents |
| traceAI-smolagents | SmolAgents |
| traceAI-autogen | AutoGen |
| traceAI-google-adk | Google ADK |
| traceAI-agno | Agno |
| traceAI-pydantic-ai | Pydantic AI |
| traceAI-claude-agent-sdk | Claude Agent SDK |
| traceAI-strands | AWS Strands Agents |
| traceAI-beeai | IBM BeeAI |

Tools and Libraries

| Package | Description |
| --- | --- |
| traceAI-haystack | Haystack |
| traceAI-dspy | DSPy |
| traceAI-guardrails | Guardrails AI |
| traceAI-instructor | Instructor |
| traceAI-portkey | Portkey |
| traceAI-mcp | Model Context Protocol |
| traceAI-pipecat | Pipecat (Voice AI) |
| traceAI-livekit | LiveKit (Real-time) |

Vector Databases

| Package | Description |
| --- | --- |
| traceAI-pinecone | Pinecone |
| traceAI-chromadb | ChromaDB |
| traceAI-qdrant | Qdrant |
| traceAI-weaviate | Weaviate |
| traceAI-milvus | Milvus |
| traceAI-lancedb | LanceDB |
| traceAI-mongodb | MongoDB Atlas Vector Search |
| traceAI-pgvector | pgvector (PostgreSQL) |
| traceAI-redis | Redis Vector Search |

TypeScript

| Package | Description |
| --- | --- |
| @traceai/fi-core | Core instrumentation library |
| @traceai/fi-semantic-conventions | Semantic conventions |

LLM Providers

| Package | Description |
| --- | --- |
| @traceai/openai | OpenAI |
| @traceai/anthropic | Anthropic |
| @traceai/google-genai | Google Generative AI |
| @traceai/fi-instrumentation-vertexai | Google Vertex AI |
| @traceai/bedrock | AWS Bedrock |
| @traceai/mistral | Mistral AI |
| @traceai/groq | Groq |
| @traceai/cohere | Cohere |
| @traceai/ollama | Ollama |
| @traceai/together | Together AI |
| @traceai/deepseek | DeepSeek |
| @traceai/fireworks | Fireworks AI |
| @traceai/cerebras | Cerebras |
| @traceai/huggingface | HuggingFace |
| @traceai/xai | xAI (Grok) |
| @traceai/vllm | vLLM |

Agent Frameworks

| Package | Description |
| --- | --- |
| @traceai/langchain | LangChain.js |
| @traceai/llamaindex | LlamaIndex |
| @traceai/openai-agents | OpenAI Agents |
| @traceai/fi-instrumentation-google-adk | Google ADK |
| @traceai/mastra | Mastra |
| @traceai/beeai | IBM BeeAI |
| @traceai/strands | AWS Strands Agents |

Tools and Libraries

| Package | Description |
| --- | --- |
| @traceai/vercel | Vercel AI SDK |
| @traceai/guardrails | Guardrails AI |
| @traceai/instructor | Instructor |
| @traceai/portkey | Portkey |
| @traceai/mcp | Model Context Protocol |
| @traceai/fi-instrumentation-pipecat | Pipecat (Voice AI) |
| @traceai/fi-instrumentation-livekit | LiveKit (Real-time) |

Vector Databases

| Package | Description |
| --- | --- |
| @traceai/pinecone | Pinecone |
| @traceai/chromadb | ChromaDB |
| @traceai/qdrant | Qdrant |
| @traceai/weaviate | Weaviate |
| @traceai/milvus | Milvus |
| @traceai/lancedb | LanceDB |
| @traceai/mongodb | MongoDB Atlas Vector Search |
| @traceai/pgvector | pgvector (PostgreSQL) |
| @traceai/redis | Redis Vector Search |

Java

Available via JitPack. Add the JitPack repository:

<repositories>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>

| Package | Description |
| --- | --- |
| traceai-java-core | Core instrumentation library |

LLM Providers

| Package | Description |
| --- | --- |
| traceai-java-openai | OpenAI |
| traceai-java-azure-openai | Azure OpenAI |
| traceai-java-anthropic | Anthropic |
| traceai-java-google-genai | Google Generative AI |
| traceai-java-cohere | Cohere |
| traceai-java-ollama | Ollama |
| traceai-java-bedrock | AWS Bedrock |
| traceai-java-vertexai | Google Vertex AI |
| traceai-java-watsonx | IBM Watsonx |

Agent Frameworks

| Package | Description |
| --- | --- |
| traceai-langchain4j | LangChain4j |
| traceai-spring-ai | Spring AI |
| traceai-spring-boot-starter | Spring Boot Auto-Configuration |
| traceai-java-semantic-kernel | Microsoft Semantic Kernel |

Vector Databases

| Package | Description |
| --- | --- |
| traceai-java-pinecone | Pinecone |
| traceai-java-qdrant | Qdrant |
| traceai-java-milvus | Milvus |
| traceai-java-weaviate | Weaviate |
| traceai-java-chromadb | ChromaDB |
| traceai-java-mongodb | MongoDB Atlas Vector Search |
| traceai-java-redis | Redis Vector Search |
| traceai-java-azure-search | Azure AI Search |
| traceai-java-pgvector | pgvector (PostgreSQL) |
| traceai-java-elasticsearch | Elasticsearch |

C#

Available on NuGet.

| Package | Description |
| --- | --- |
| fi-instrumentation-otel | Core instrumentation library |

Compatibility Matrix

| Category | Framework | Python | TypeScript | Java | C# |
| --- | --- | :---: | :---: | :---: | :---: |
| LLM Providers | OpenAI | ✅ | ✅ | ✅ | |
| | Anthropic | ✅ | ✅ | ✅ | |
| | AWS Bedrock | ✅ | ✅ | ✅ | |
| | Google Vertex AI | ✅ | ✅ | ✅ | |
| | Google Generative AI | ✅ | ✅ | ✅ | |
| | Mistral AI | ✅ | ✅ | | |
| | Groq | ✅ | ✅ | | |
| | Cohere | ✅ | ✅ | ✅ | |
| | Ollama | ✅ | ✅ | ✅ | |
| | LiteLLM | ✅ | | | |
| | Together AI | ✅ | ✅ | | |
| | DeepSeek | ✅ | ✅ | | |
| | Fireworks AI | ✅ | ✅ | | |
| | Cerebras | ✅ | ✅ | | |
| | HuggingFace | ✅ | ✅ | | |
| | xAI (Grok) | ✅ | ✅ | | |
| | vLLM | ✅ | ✅ | | |
| | Azure OpenAI | | | ✅ | |
| | IBM Watsonx | | | ✅ | |
| Agent Frameworks | LangChain | ✅ | ✅ | | |
| | LlamaIndex | ✅ | ✅ | | |
| | CrewAI | ✅ | | | |
| | AutoGen | ✅ | | | |
| | OpenAI Agents | ✅ | ✅ | | |
| | SmolAgents | ✅ | | | |
| | Google ADK | ✅ | ✅ | | |
| | Agno | ✅ | | | |
| | Pydantic AI | ✅ | | | |
| | Claude Agent SDK | ✅ | | | |
| | AWS Strands Agents | ✅ | ✅ | | |
| | IBM BeeAI | ✅ | ✅ | | |
| | Mastra | | ✅ | | |
| | LangChain4j | | | ✅ | |
| | Spring AI | | | ✅ | |
| | Semantic Kernel | | | ✅ | |
| Tools & Libraries | Haystack | ✅ | | | |
| | DSPy | ✅ | | | |
| | Guardrails AI | ✅ | ✅ | | |
| | Instructor | ✅ | ✅ | | |
| | Portkey | ✅ | ✅ | | |
| | Vercel AI SDK | | ✅ | | |
| | MCP | ✅ | ✅ | | |
| | Pipecat | ✅ | ✅ | | |
| | LiveKit | ✅ | ✅ | | |
| Vector Databases | Pinecone | ✅ | ✅ | ✅ | |
| | ChromaDB | ✅ | ✅ | ✅ | |
| | Qdrant | ✅ | ✅ | ✅ | |
| | Weaviate | ✅ | ✅ | ✅ | |
| | Milvus | ✅ | ✅ | ✅ | |
| | LanceDB | ✅ | ✅ | | |
| | MongoDB Atlas | ✅ | ✅ | ✅ | |
| | pgvector | ✅ | ✅ | ✅ | |
| | Redis | ✅ | ✅ | ✅ | |
| | Azure AI Search | | | ✅ | |
| | Elasticsearch | | | ✅ | |

Legend: ✅ Supported | blank = not yet available


Architecture

traceAI is built on top of OpenTelemetry and follows standard OTel instrumentation patterns:

Full OpenTelemetry Compatibility

  • Works with any OTel-compatible backend
  • Standard OTLP exporters (HTTP/gRPC)
  • Compatible with existing OTel setups
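Because the exporters speak standard OTLP, a standard OpenTelemetry SDK setup can be pointed at any backend through the spec-defined environment variables, with no code changes. The endpoint and header values below are placeholders (whether traceAI's own register() helper reads these depends on its configuration; they are honored by the standard OTel SDK exporters):

```shell
# Standard OTel SDK variables (OTLP exporter spec); values are placeholders.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otel-collector.example.com:4318"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"   # or "grpc"
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=your-key"
export OTEL_SERVICE_NAME="my_ai_app"
```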

Bring Your Own Configuration

You can use traceAI with your own OpenTelemetry setup:

Python: Custom TracerProvider & Exporters

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceai_openai import OpenAIInstrumentor

# Set up your own tracer provider
tracer_provider = TracerProvider()
trace.set_tracer_provider(tracer_provider)

# Add custom exporters (example with Future AGI)
otlp_exporter = OTLPSpanExporter(
    endpoint="https://api.futureagi.com/tracer/v1/traces",
    headers={
        "X-API-KEY": "your-api-key",
        "X-SECRET-KEY": "your-secret-key"
    }
)
tracer_provider.add_span_processor(BatchSpanProcessor(otlp_exporter))

# Instrument with traceAI
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

TypeScript: Custom TracerProvider, Span Processors & Headers

import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { Resource } from "@opentelemetry/resources";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@traceai/openai";

const provider = new NodeTracerProvider({
  resource: new Resource({ "service.name": "my-ai-service" }),
});

const exporter = new OTLPTraceExporter({
  url: "https://api.futureagi.com/tracer/v1/traces",
  headers: {
    "X-API-KEY": process.env.FI_API_KEY!,
    "X-SECRET-KEY": process.env.FI_SECRET_KEY!,
  },
});

provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

registerInstrumentations({
  tracerProvider: provider,
  instrumentations: [new OpenAIInstrumentation()],
});

What Gets Captured

traceAI automatically captures rich telemetry data:

  • Prompts & Completions: Full request/response content
  • Token Usage: Input, output, and total tokens
  • Model Parameters: Temperature, top_p, max_tokens, etc.
  • Tool Calls: Function/tool names, arguments, and results
  • Streaming: Individual chunks with delta tracking
  • Errors: Detailed error context and stack traces
  • Timing: Latency at each step of the AI workflow

All data follows OpenTelemetry Semantic Conventions for GenAI.
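Concretely, an LLM-call span carries attributes under the `gen_ai.*` namespace. A minimal sketch of such an attribute set, using names from the OTel GenAI semantic conventions with illustrative values (the exact set traceAI emits may differ):

```python
# Span attributes for one chat completion, following OTel GenAI
# semantic-convention names (gen_ai.*). Values are illustrative.
attributes = {
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4.1",
    "gen_ai.request.temperature": 0.7,
    "gen_ai.usage.input_tokens": 12,
    "gen_ai.usage.output_tokens": 48,
}

# Because the names are standardized, any backend can aggregate them,
# e.g. total token usage per call:
total_tokens = (
    attributes["gen_ai.usage.input_tokens"]
    + attributes["gen_ai.usage.output_tokens"]
)
```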


Roadmap

  • Go language support
  • Sampling strategies for high-volume production environments
  • Continuous semantic convention updates as the OTel GenAI spec evolves
  • Evaluation integration connecting traces to quality measurement pipelines
  • Expanded agent framework coverage

See our ROADMAP.md for the full roadmap.
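On the sampling item: until built-in strategies land, head sampling can be done with the standard OTel approach of deciding deterministically from the trace id, so every span in a trace gets the same keep/drop decision. A pure-Python sketch of the idea (the same scheme as OpenTelemetry's TraceIdRatioBased sampler, not a traceAI API):

```python
import random

def should_sample(trace_id: int, ratio: float) -> bool:
    """Keep roughly `ratio` of traces, decided from the low 64 bits of the
    (random) trace id, so the decision is consistent across all spans."""
    bound = int(ratio * (1 << 64))
    return (trace_id & ((1 << 64) - 1)) < bound

# With random 64-bit trace ids, about 25% of traces are kept at ratio=0.25.
random.seed(0)
kept = sum(should_sample(random.getrandbits(64), 0.25) for _ in range(10_000))
```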


Contributing

We welcome contributions! Read our Contributing Guide for details.

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Found a bug? Open an issue with a minimal reproduction.


Resources

| Resource | Description |
| --- | --- |
| Website | Learn more about Future AGI |
| Documentation | Complete guides and API reference |
| Cookbooks | Step-by-step implementation examples |
| Roadmap | Planned features and integrations |
| Changelog | All release notes and updates |
| Contributing Guide | How to contribute to traceAI |
| Slack | Join our community |
| Issues | Report bugs or request features |

Connect With Us

Website | LinkedIn | Twitter | Reddit | Substack


Built with care by the Future AGI team

Star us on GitHub | Report Bug | Request Feature
