How to handle multi-message prompts in OpenAI Chat node?

Describe the problem/error/question

I want to send multiple messages to the OpenAI Chat node (LangChain → LLM Prompt Template) in a single execution, simulating a conversation. The first message is fixed, and the following ones are dynamic (from an array).

The goal is that the entire list of messages (e.g. 3–20 follow-ups) is processed together so that the LLM responds in the context of the full simulated chat, not one message at a time, and not as one long single message.

But I couldn’t find a way to do that. I did a lot of research and also asked ChatGPT, but couldn’t find an explanation of how to do this. I’ve tried putting all the prompts in one node, but that just concatenates them into one long prompt and doesn’t give me the result I’m looking for.

How can I send multiple prompts to the OpenAI Chat node in sequence and get back intermediate replies, while keeping the context across all turns?
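For context, this is the shape I’m after, shown as a plain JavaScript sketch outside n8n (assumption: standard OpenAI-style chat messages with `role`/`content` fields; the helper names are mine). Each call to the model would send the full history, so the reply to message N sees messages 1..N and all earlier replies:

```javascript
// Start the history with the fixed first message.
function startConversation(firstMessage) {
  return [{ role: "user", content: firstMessage }];
}

// After each model call, append the assistant reply plus the next user turn,
// so the next call carries the whole conversation so far.
function addTurn(history, assistantReply, nextUserMessage) {
  return [
    ...history,
    { role: "assistant", content: assistantReply },
    { role: "user", content: nextUserMessage },
  ];
}

// Build the history for the example conversation from my log.
let history = startConversation("Do you know the game Universal Paperclips?");
history = addTurn(history, "(model reply 1)", "Who is the creator?");
history = addTurn(history, "(model reply 2)", "What are similar games?");

console.log(history.length);  // 5 messages: 3 user turns, 2 assistant replies
console.log(history[2].role); // "assistant"
```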

What is the error message (if any)?

I don’t really get an error message, but the result is not what I expect.

Please share your workflow

This is not the actual workflow I’m working on, since it is quite complex, but here is a simplified version of what I’m trying to do.

Share the output returned by the last node

[
  {
    "text": "Yes — Universal Paperclips is a well‑known browser incremental game. It was created by Frank Lantz (released 2017; hosted at decisionproblem.com/paperclips). The game is inspired by the \"paperclip maximizer\" thought experiment about misaligned AI.\n\nSimilar games (short list, with whether they are genuine idle/clicker games)\n\nPure idle/clicker classics\n- Cookie Clicker — yes (classic idle/incremental clicker with automation and prestige).\n- Adventure Capitalist — yes (idle capitalist-style game; money accumulates/automates while away).\n- Clicker Heroes — yes (idle RPG-style with automated DPS and prestige).\n- Idle Miner Tycoon — yes (mobile/idle resource management).\n- Realm Grinder — yes (incremental/clicker with lots of automation and faction mechanics).\n\nIncremental/idle-adjacent (more active strategy, management or narrative elements; many still allow offline progress but are less \"click-to-increase\" focused)\n- Kittens Game — mostly idle/incremental but heavier strategy/micro-management (commonly classified as an incremental/idle game).\n- Trimps — incremental with deeper mechanics and combat; can run semi-idle but is more active/complex.\n- Antimatter Dimensions — idle/incremental (prestige-heavy; often classed with clickers).\n- NGU Idle — idle/incremental but with many active minigames and layers.\n- Cookie Clicker–style hybrids like Candy Box / Candy Box 2 — more adventure/ASCII incremental; less pure idle.\n- A Dark Room — a minimalist incremental/story game; not a pure idle clicker (more active/resource-driven).\n\nQuick guidance\n- If by \"real idle clickers\" you mean games in which the primary gameplay loop is clicking to gain a resource that you can increasingly automate and that makes measurable progress while you're away, then Cookie Clicker, Adventure Capitalist, Clicker Heroes, Realm Grinder, Idle Miner Tycoon, Antimatter Dimensions, NGU Idle, and many mobile \"idle\" titles qualify.\n- If you want more recommendations 
tailored to whether you want narrative (like Universal Paperclips), deep strategy, or pure \"leave it running\" idle play, tell me which and I can list the best matches."
  }
]

Here is also the log:

{
  "messages": [
    "Human: Do you know the game Univeral Paperclips?\nHuman: Who is the creator?\nHuman: What are similar games?\nHuman: Which of those are real idle clickers?\nHuman: "
  ],
  "estimatedTokens": 40,
  "options": {
    "openai_api_key": {
      "lc": 1,
      "type": "secret",
      "id": [
        "OPENAI_API_KEY"
      ]
    },
    "model": "gpt-5-mini",
    "timeout": 1000000,
    "max_retries": 2,
    "configuration": {
      "baseURL": "https://api.openai.com/v1",
      "fetchOptions": {}
    },
    "model_kwargs": {}
  }
}

Information on your n8n setup

  • n8n version: 1.115.2
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Linux Ubuntu

Hi @Heisenbug

You can try using:

{{ $json.followUps.join('\n') }}

That expression combines all follow-up messages from your array into one text block, with each message on a new line.

Joining the follow-ups into one message does technically work, but it’s not what I’m trying to achieve. I’m not looking to send one long prompt with multiple questions, but to simulate an actual multi-turn conversation, where each user message is followed by an AI response, and then the next message builds on that context.

In other words:
I want the LLM to process each message one after another, with intermediate replies, so that the conversation develops like a real chat, not just one big block of text.

Any idea how that could be done?

This is exactly what the AI Agent node can do.

I think you need to replace the Basic LLM Chain node with an AI Agent (with memory).
Also, don’t feed all the messages as a single array; split them so each message is processed individually.
You’ll also need to handle the Session ID for memory management.


Thank you so much, the AI Agent is exactly what I was looking for :+1:
