LangChain is a framework for developing applications powered by language models. Braintrust traces LangChain applications with callback handlers that capture chains, agents, retrievers, tools, and individual LLM calls.
This guide covers both approaches: auto-instrumentation for the quickest setup, and manual callback configuration when you need explicit control over where tracing is attached.

Setup

Install Braintrust alongside your LangChain packages:
# pnpm
pnpm add braintrust @braintrust/langchain-js @langchain/core @langchain/openai
# npm
npm install braintrust @braintrust/langchain-js @langchain/core @langchain/openai
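The Python examples below also need the Braintrust SDK and LangChain packages. A typical install, assuming the standard PyPI package names, is:

```shell
# pip
pip install braintrust langchain-core langchain-openai
```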

Trace with LangChain

Calling braintrust.auto_instrument() registers a global LangChain callback handler that traces chains, agents, retrievers, tools, and LLM calls when langchain-core is installed.

Python auto-instrumentation

trace-langchain-auto.py
import braintrust

braintrust.auto_instrument()
braintrust.init_logger(project="My Project")

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("What is 1 + {number}?")
model = ChatOpenAI(model="gpt-5-mini")
chain = prompt | model

response = chain.invoke({"number": "2"})
print(response.content)

Manual callback setup

If you want explicit control over where the callback handler is attached, configure BraintrustCallbackHandler yourself:
trace-langchain.ts
import { BraintrustCallbackHandler } from "@braintrust/langchain-js";
import { ChatOpenAI } from "@langchain/openai";
import { initLogger } from "braintrust";

initLogger({
  projectName: "My Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const handler = new BraintrustCallbackHandler();

async function main() {
  const model = new ChatOpenAI({ model: "gpt-5-mini" });

  await model.invoke("What is the capital of France?", {
    callbacks: [handler],
  });
}

main();
Run this code:
npx ts-node trace-langchain.ts
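For parity with the TypeScript example, a manual setup in Python might look like the sketch below. The `braintrust_langchain` import and the `braintrust-langchain` package name are assumptions modeled on the JS package (`@braintrust/langchain-js`), so check the SDK reference for your installed version:

```python
# trace-langchain.py - manual callback setup in Python.
# Assumes `pip install braintrust-langchain langchain-openai`; the
# braintrust_langchain import mirrors the JS package and is an assumption.
from braintrust import init_logger
from braintrust_langchain import BraintrustCallbackHandler
from langchain_openai import ChatOpenAI

# Reads BRAINTRUST_API_KEY from the environment.
init_logger(project="My Project")

handler = BraintrustCallbackHandler()

model = ChatOpenAI(model="gpt-5-mini")
# Attach the handler per-call via the standard LangChain config dict.
response = model.invoke(
    "What is the capital of France?",
    config={"callbacks": [handler]},
)
print(response.content)
```

Run this code:
python trace-langchain.py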
The same callback handlers also work with LangGraph.

Resources