TypeScript SDK - Instrumentation
To instrument your application to send traces to Langfuse, you can use:
- Native instrumentation of LLM/agent libraries for out-of-the-box tracing
- Custom instrumentation methods for fine-grained control:
  - Context manager: startActiveObservation
  - Wrapper: observe
  - Manual: startObservation
Native instrumentation
Langfuse integrates with many LLM/agent libraries to automatically trace your application. For a full list, see the Langfuse Integrations page.
These are the most popular ones:
The @langfuse/openai package provides a wrapper to automatically trace calls to the OpenAI SDK.
For an end-to-end example, see the Langfuse + OpenAI JS/TS Cookbook.
Installation:
npm install @langfuse/openai
Usage:
The observeOpenAI function wraps your OpenAI client instance. All subsequent API calls made with the wrapped client will be traced as generations and nested automatically in the current trace tree. If there's no active trace in context, a new one will be created automatically.
import { OpenAI } from "openai";
import { observeOpenAI } from "@langfuse/openai";

// Instantiate the OpenAI client as usual
const openai = new OpenAI();

// Wrap it with Langfuse
const tracedOpenAI = observeOpenAI(openai, {
  // Pass trace-level attributes that will be applied to all calls
  traceName: "my-openai-trace",
  sessionId: "user-session-123",
  userId: "user-abc",
  tags: ["openai-integration"],
});

// Use the wrapped client just like the original
const completion = await tracedOpenAI.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "What is OpenTelemetry?" }],
});
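Because the wrapped client nests its generations under the currently active trace, you can also combine it with the custom instrumentation described below. A minimal sketch, assuming the span name and prompt are illustrative:

import { OpenAI } from "openai";
import { observeOpenAI } from "@langfuse/openai";
import { startActiveObservation } from "@langfuse/tracing";

const tracedOpenAI = observeOpenAI(new OpenAI());

await startActiveObservation("summarize-request", async (span) => {
  span.update({ input: { articleId: "article-42" } });

  // This generation is nested under the "summarize-request" span
  const completion = await tracedOpenAI.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Summarize OpenTelemetry in one sentence." }],
  });

  span.update({ output: completion.choices[0].message.content });
});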
Custom instrumentation
You can add custom instrumentation to your application via:
- the observe wrapper
- startActiveObservation context managers
- manually managing the observation lifecycle and its nesting with the startObservation function
For an end-to-end example, see the JS Instrumentation Cookbook.
Context management with callbacks
To simplify nesting and context management, you can use startActiveObservation. This function takes a callback and automatically manages the observation's lifecycle and the OpenTelemetry context. Any observation created inside the callback will automatically be nested under the active observation, and the observation will be ended when the callback finishes.
This is the recommended approach for most use cases as it prevents context leakage and ensures observations are properly ended.
import { startActiveObservation, startObservation } from "@langfuse/tracing";

await startActiveObservation(
  // name
  "user-request",
  // callback
  async (span) => {
    span.update({
      input: { query: "What is the capital of France?" },
    });

    // Example child, could also use startActiveObservation
    // This manually created generation (see docs below) will automatically be a child of "user-request"
    const generation = startObservation(
      "llm-call",
      {
        model: "gpt-4",
        input: [{ role: "user", content: "What is the capital of France?" }],
      },
      { asType: "generation" }
    );
    generation.update({
      usageDetails: { input: 10, output: 5 },
      output: { content: "The capital of France is Paris." },
    });
    generation.end();

    span.update({ output: "Successfully answered." });
  }
);
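Active observations can also be nested directly: each callback's observations become children of the enclosing active observation. A short sketch, with illustrative names:

import { startActiveObservation } from "@langfuse/tracing";

await startActiveObservation("handle-request", async (requestSpan) => {
  requestSpan.update({ input: { route: "/chat" } });

  // Automatically nested as a child of "handle-request"
  await startActiveObservation("load-context", async (loadSpan) => {
    loadSpan.update({ output: { documents: 3 } });
  });

  requestSpan.update({ output: { status: "ok" } });
});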
observe wrapper
The observe wrapper is a powerful tool for tracing existing functions without modifying their internal logic. It acts as a decorator that automatically creates a span or generation around the function call. You can use the updateActiveObservation function to add attributes to the observation from within the wrapped function.
import { observe, updateActiveObservation } from "@langfuse/tracing";
// An existing function
// An existing function
async function fetchData(source: string) {
  updateActiveObservation({ metadata: { source: "API" } }, { asType: "span" });
  // ... logic to fetch data
  return { data: `some data from ${source}` };
}
// Wrap the function to trace it
const tracedFetchData = observe(
  // method
  fetchData,
  // options, optional, see below
  {}
);
// Now, every time you call tracedFetchData, a span is created.
// Its input and output are automatically populated with the
// function's arguments and return value.
const result = await tracedFetchData("API");
You can configure the observe wrapper by passing an options object as the second argument (see the example after the table):
| Option | Description | Default |
| --- | --- | --- |
| name | The name of the observation. | The original function's name. |
| asType | The type of observation to create (e.g. span, generation). | "span" |
| captureInput | Whether to capture the function's arguments as the input of the observation. | true |
| captureOutput | Whether to capture the function's return value or thrown error as the output of the observation. | true |
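For example, to trace an existing LLM helper as a generation under a custom name (a sketch; the helper function is illustrative):

import { observe } from "@langfuse/tracing";

// A hypothetical helper that calls an LLM
async function callModel(prompt: string) {
  // ... call your model here ...
  return "The capital of France is Paris.";
}

const tracedCallModel = observe(callModel, {
  name: "call-model",
  asType: "generation",
  captureInput: true,
  captureOutput: true,
});

const answer = await tracedCallModel("What is the capital of France?");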
Manual observations
The core tracing function (startObservation) gives you full control over creating observations. You can pass the asType option to specify the type of observation to create.
When you call startObservation, the new observation is automatically linked as a child of the currently active operation in the OpenTelemetry context. However, it does not make this new observation the active one. This means any further operations you trace will still be linked to the original parent, not the one you just created.
To create nested observations manually, use the methods on the returned object (e.g., parentSpan.startObservation(...)).
import { startObservation } from "@langfuse/tracing";

// Start a root span for a user request
const span = startObservation(
  // name
  "user-request",
  // params
  {
    input: { query: "What is the capital of France?" },
  }
);

// Create a nested span for e.g. a tool call
const toolCall = span.startObservation(
  // name
  "fetch-weather",
  // params
  {
    input: { city: "Paris" },
  },
  // Specify observation type in asType
  // This will type the attributes argument accordingly
  // Default is 'span'
  { asType: "tool" }
);

// Simulate work and end the tool call span
await new Promise((resolve) => setTimeout(resolve, 100));
toolCall.update({ output: { temperature: "15°C" } }).end();

// Create a nested generation for the LLM call
const generation = span.startObservation(
  "llm-call",
  {
    model: "gpt-4",
    input: [{ role: "user", content: "What is the capital of France?" }],
  },
  { asType: "generation" }
);
generation.update({
  usageDetails: { input: 10, output: 5 },
  output: { content: "The capital of France is Paris." },
});
generation.end();

// End the root span
span.update({ output: "Successfully answered user request." }).end();
If you use startObservation(), you are responsible for calling .end() on the returned observation object. Failure to do so will result in incomplete or missing observations in Langfuse.
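If the traced work can throw, a try/finally block is one way to guarantee that .end() is still called (a sketch using the API shown above):

import { startObservation } from "@langfuse/tracing";

const span = startObservation("risky-operation");
try {
  // ... work that may throw ...
  span.update({ output: { status: "ok" } });
} finally {
  // Ensure the observation is ended even if an error was thrown
  span.end();
}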
Updating Traces
Often, you might not have all the information about a trace (like a userId or sessionId) when you start it. The SDK lets you add or update trace-level attributes at any point during its execution.
.updateTrace() on an observation
When you create an observation manually with startObservation, the returned object has an .updateTrace() method. You can call this at any time before the root span ends to apply attributes to the entire trace.
import { startObservation } from "@langfuse/tracing";

// Start a trace without knowing the user yet
const rootSpan = startObservation("data-processing");

// ... some initial steps ...

// Later, once the user is authenticated, update the trace
const userId = "user-123";
const sessionId = "session-abc";

rootSpan.updateTrace({
  userId: userId,
  sessionId: sessionId,
  tags: ["authenticated-user"],
  metadata: { plan: "premium" },
});

// ... continue with the rest of the trace ...
const generation = rootSpan.startObservation(
  "llm-call",
  {},
  { asType: "generation" }
);
generation.end();
rootSpan.end();
updateActiveTrace()
When you're inside a callback from startActiveObservation, or a function wrapped with observe, you might not have a direct reference to an observation object. In these cases, use the updateActiveTrace() function. It automatically finds the currently active trace in the context and applies the new attributes.
import { startActiveObservation, updateActiveTrace } from "@langfuse/tracing";

await startActiveObservation("user-request", async (span) => {
  // Initial part of the request
  span.update({ input: { path: "/api/process" } });

  // Simulate fetching user data
  await new Promise((resolve) => setTimeout(resolve, 50));
  const user = { id: "user-5678", name: "Jane Doe" };

  // Update the active trace with the user's information
  updateActiveTrace({
    userId: user.id,
    metadata: { userName: user.name },
  });

  // ... continue logic ...
  span.update({ output: { status: "success" } }).end();
});
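The same applies inside a function wrapped with observe. A sketch, with an illustrative handler and arguments:

import { observe, updateActiveTrace } from "@langfuse/tracing";

const handleRequest = observe(async function handleRequest(userId: string) {
  // Attach the authenticated user to the trace created by the wrapper
  updateActiveTrace({ userId, tags: ["authenticated-user"] });
  // ... request handling ...
  return { status: "ok" };
});

await handleRequest("user-5678");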