TypeScript SDK - Setup
The Langfuse TypeScript SDK offers two setup approaches:
- Tracing for Langfuse Observability using OpenTelemetry
- Client for other Langfuse features like prompt management, evaluation, or accessing the Langfuse API
Tracing Setup
Installation
Install the relevant packages for a full tracing setup:
npm install @langfuse/tracing @langfuse/otel @opentelemetry/sdk-node
- @langfuse/tracing: Core tracing functions (startObservation, startActiveObservation, etc.)
- @langfuse/otel: The LangfuseSpanProcessor to export traces to Langfuse.
- @opentelemetry/sdk-node: The OpenTelemetry SDK for Node.js.
Learn more about the packages here.
Register your credentials
Add your Langfuse credentials to your environment variables. Make sure that you have a .env file in your project root and a package like dotenv to load the variables.
LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_BASE_URL="https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_BASE_URL="https://us.cloud.langfuse.com" # 🇺🇸 US region
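A minimal sketch of loading the file with dotenv (assuming it is installed via npm install dotenv; if your runtime already loads .env files, for example Node's --env-file flag or Next.js, this step is unnecessary):
// index.ts - load .env before any Langfuse code reads the environment
import "dotenv/config";

// Optional sanity check that the credentials reached the process
if (!process.env.LANGFUSE_SECRET_KEY || !process.env.LANGFUSE_PUBLIC_KEY) {
  throw new Error("Langfuse credentials are missing from the environment");
}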
Initialize OpenTelemetry
The Langfuse TypeScript SDK’s tracing is built on top of OpenTelemetry, so you need to set up the OpenTelemetry SDK. The LangfuseSpanProcessor is the key component that sends traces to Langfuse.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});
sdk.start();
For more options to configure the LangfuseSpanProcessor, such as masking and filtering, see the advanced usage.
You can learn more about setting up OpenTelemetry in your JS environment here.
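Once the NodeSDK has started, observations created with the functions from @langfuse/tracing are exported by the LangfuseSpanProcessor. A quick way to verify the setup is to create and end a single observation; this is only a sketch, and the exact parameters of startObservation are covered in the tracing guide (the sdk variable is the NodeSDK instance from the snippet above):
import { startObservation } from "@langfuse/tracing";

async function main() {
  // Create an observation and end it so it is queued for export to Langfuse.
  const span = startObservation("setup-check");
  span.end();

  // Shutting down the NodeSDK flushes pending spans before the process exits.
  await sdk.shutdown();
}

main();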
Next.js users:
If you are using Next.js, please use the OpenTelemetry setup via the NodeSDK described above rather than registerOTel from @vercel/otel. The @vercel/otel package does not yet support the OpenTelemetry JS SDK v2, on which the @langfuse/tracing and @langfuse/otel packages are based.
See here for a full example of the Vercel AI SDK with Next.js on Vercel.
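One common place to run the NodeSDK setup in a Next.js app is the instrumentation.ts hook, which Next.js calls on server startup. The file location and the NEXT_RUNTIME guard follow Next.js conventions and are assumptions about your project layout; the Langfuse part is identical to the snippet above:
// instrumentation.ts (project root) - Next.js calls register() on server startup
export async function register() {
  // The NodeSDK only runs in the Node.js runtime, not the Edge runtime.
  if (process.env.NEXT_RUNTIME === "nodejs") {
    const { NodeSDK } = await import("@opentelemetry/sdk-node");
    const { LangfuseSpanProcessor } = await import("@langfuse/otel");

    const sdk = new NodeSDK({
      spanProcessors: [new LangfuseSpanProcessor()],
    });
    sdk.start();
  }
}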
Client Setup
Installation
npm install @langfuse/client
Register your credentials
Add your Langfuse credentials to your environment variables. Make sure that you have a .env file in your project root and a package like dotenv to load the variables.
LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_BASE_URL="https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_BASE_URL="https://us.cloud.langfuse.com" # 🇺🇸 US region
Initialize the client
Initialize the LangfuseClient to interact with Langfuse. The client will automatically use the environment variables you set above.
import { LangfuseClient } from "@langfuse/client";
const langfuse = new LangfuseClient();
Alternative: Configure via constructor
You can also pass configuration options directly to the constructor:
import { LangfuseClient } from "@langfuse/client";
const langfuse = new LangfuseClient({
  publicKey: "your-public-key",
  secretKey: "your-secret-key",
  baseUrl: "https://cloud.langfuse.com", // or your self-hosted instance
});
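With the client in place you can use the other Langfuse features, for example prompt management. The prompt name "movie-critic", its movie variable, and the exact method names below are assumptions for illustration; check the prompt management docs for the current API:
import { LangfuseClient } from "@langfuse/client";

const langfuse = new LangfuseClient();

async function run() {
  // Fetch a prompt managed in Langfuse (hypothetical name "movie-critic")
  // and fill in its template variables.
  const prompt = await langfuse.prompt.get("movie-critic");
  const compiled = prompt.compile({ movie: "Dune 2" });
  console.log(compiled);
}

run();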