[Community Node] OpenAI + Langfuse integration for observability & agent tracing

Hi there! :waving_hand: We’re excited to share our custom n8n community node that integrates OpenAI with Langfuse to add observability :bullseye:. It brings structured tracing and metadata tracking to your OpenAI-powered workflows.

This project is proudly developed and maintained by Wistron DXLab.

Why Use This Integration?

This custom node is perfect for data-driven teams and observability enthusiasts looking to:

  1. Track and trace every request/response between n8n and OpenAI-compatible models.
  2. Inject contextual metadata (sessionId, userId, etc.) into your workflows.
  3. Visualize workflows and trace logs directly in your Langfuse dashboard.

Installation

For n8n v0.187+, you can install this node directly from the Community Nodes feature.

How to Install

  1. Navigate to Settings → Community Nodes in your n8n UI.
  2. Click Install.
  3. Enter n8n-nodes-openai-langfuse in the npm package field.
  4. Agree to the community node usage risks.
  5. Click Install to complete the process!

Credential Setup

To use this node, you’ll need credentials to:

  1. Authenticate with your OpenAI-compatible LLM endpoint.
  2. Enable Langfuse tracing by logging requests and responses to Langfuse.

:blue_book: OpenAI Credential Details

| Field Name | Description | Example |
| --- | --- | --- |
| OpenAI API Key | Your secret API key to access the OpenAI endpoint. | `sk-xx` |
| OpenAI Organization ID | (Optional) Your OpenAI organization ID, if required. | `org-xyz789` |
| OpenAI Base URL | Base URL for your OpenAI-compatible endpoint. | Default: `https://api.openai.com/v1` |
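
For reference, here is a minimal sketch of how fields like these are typically consumed by the official `openai` JS SDK. This is only an illustration of the credential shape, not the node’s internal code; the model name and values are placeholders.

```js
import OpenAI from "openai";

// Placeholder values standing in for the credential fields above.
const openai = new OpenAI({
  apiKey: "sk-xx",                      // OpenAI API Key
  organization: "org-xyz789",           // OpenAI Organization ID (optional)
  baseURL: "https://api.openai.com/v1", // OpenAI Base URL (any OpenAI-compatible endpoint)
});

// A plain chat completion against the configured endpoint.
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini", // example model name
  messages: [{ role: "user", content: "Hello from n8n!" }],
});
console.log(completion.choices[0].message.content);
```

Because only the base URL changes, the same credential works for self-hosted or proxy endpoints that speak the OpenAI API.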

:blue_book: Langfuse Credential Details

| Field Name | Description | Example |
| --- | --- | --- |
| Langfuse Base URL | The base URL of your Langfuse instance (cloud or self-hosted). | `https://cloud.langfuse.com` or custom URL |
| Langfuse Public Key * | Used for authentication in Langfuse. | `pk-xx` |
| Langfuse Secret Key * | Used for authentication in Langfuse. | `sk-xx` |

How to Find Your Langfuse Keys:
Log in to your Langfuse dashboard and navigate to:
Settings → Projects → [Your Project] to retrieve your publicKey and secretKey.
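
These three values map directly onto the Langfuse JS SDK constructor. A minimal sketch, assuming the `langfuse` npm package (an illustration rather than the node’s actual source):

```js
import { Langfuse } from "langfuse";

// Placeholder values matching the credential fields above.
const langfuse = new Langfuse({
  baseUrl: "https://cloud.langfuse.com", // Langfuse Base URL (cloud or self-hosted)
  publicKey: "pk-xx",                    // Langfuse Public Key
  secretKey: "sk-xx",                    // Langfuse Secret Key
});
```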

Once your credentials are correctly filled in, they will look like this:

After saving the credentials, you’re ready to configure your workflows and see your traces in Langfuse!


Using the Node

This node automatically injects Langfuse-compatible metadata into your OpenAI requests. You can trace every workflow run with context such as sessionId, userId, and any custom metadata.

Supported Metadata Fields

| Field | Type | Description |
| --- | --- | --- |
| sessionId | string | A unique identifier for your logical session. |
| userId | string | The ID of the end user making the request. |
| metadata | object | A custom JSON object for additional workflow context. |

Example of Custom Metadata:

```json
{
  "project": "test-project",
  "env": "dev",
  "workflow": "main-flow"
}
```
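
To show where these fields end up, here is a hedged sketch of how a trace carrying `sessionId`, `userId`, and custom metadata can be recorded with the Langfuse JS SDK. The trace and generation names are placeholders, and the calls the node makes internally may differ.

```js
import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  baseUrl: "https://cloud.langfuse.com",
  publicKey: "pk-xx",
  secretKey: "sk-xx",
});

// One trace per workflow run, carrying the fields from the table above.
const trace = langfuse.trace({
  name: "n8n-openai-chat", // hypothetical trace name
  sessionId: "session-42",
  userId: "user-123",
  metadata: { project: "test-project", env: "dev", workflow: "main-flow" },
});

// Each OpenAI call becomes a generation attached to that trace.
const generation = trace.generation({
  name: "chat-completion",
  model: "gpt-4o-mini",
  input: [{ role: "user", content: "Hello from n8n!" }],
});
generation.end({ output: "Hi! How can I help?" });

// Ensure buffered events are sent before the process exits.
await langfuse.flushAsync();
```

In the Langfuse UI, traces that share a `sessionId` are grouped into one session, which is what makes multi-turn workflows easy to follow.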

Example Workflow Setup

Here’s a typical workflow configuration using the Langfuse Chat Node:

1. Node Configuration

In the node UI, configure your inputs as follows:

| Input Field | Example Value |
| --- | --- |
| Session ID | `{{$json.sessionId}}` |
| User ID | `user-123` |
| Custom Metadata | `{ "project": "example", "env": "prod" }` |
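
For the Session ID expression to resolve, the item reaching this node must actually contain a `sessionId` field. Here is a minimal sketch of an upstream n8n Code node (JavaScript) that could supply it, assuming one session per execution via n8n’s built-in `$execution.id`:

```js
// n8n Code node ("Run Once for All Items") placed before the Langfuse Chat node.
// $execution.id is n8n's built-in execution identifier.
return [
  {
    json: {
      sessionId: `session-${$execution.id}`, // consumed by {{$json.sessionId}}
      userId: "user-123",
    },
  },
];
```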


2. Workflow Design

An example workflow setup using this node:


3. Langfuse Trace Output

Once your workflow runs, you can view visualized traces in your Langfuse dashboard:


This makes it easy to trace and analyze LLM usage inside n8n with Langfuse, including session-level metadata.
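
If you also want to pull traces programmatically rather than only browsing the dashboard, Langfuse exposes a public REST API authenticated with the same public/secret key pair over HTTP Basic auth. A hedged sketch, assuming the `GET /api/public/traces` endpoint and its `sessionId` filter; values are placeholders:

```js
// List recent traces for one session via Langfuse's public API (Node 18+ for global fetch).
const baseUrl = "https://cloud.langfuse.com";
const auth = Buffer.from("pk-xx:sk-xx").toString("base64"); // publicKey:secretKey

const res = await fetch(`${baseUrl}/api/public/traces?sessionId=session-42&limit=10`, {
  headers: { Authorization: `Basic ${auth}` },
});
const { data } = await res.json();
for (const trace of data) {
  console.log(trace.id, trace.name, trace.userId);
}
```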

Hope it helps! :rocket:

GitHub: https://github.com/rorubyy/n8n-nodes-openai-langfuse (an n8n community node that brings Langfuse observability to your OpenAI chat workflows)

npm: https://www.npmjs.com/package/n8n-nodes-openai-langfuse

Langfuse maintainer here, thanks so much for sharing this community node! Let us know whenever you have any technical questions, happy to help.

For context, full discussion regarding tracing of n8n workflows: n8n Integration · langfuse · Discussion #4397 · GitHub
