Customize the "Question and Answer Chain" node "Question-Answering" prompt

The idea is:

In the “Question and Answer Chain” node, add the ability to customize the default prompt of the underlying question-answering (“stuff” documents) chain.

When adding a retriever, such as the Vector Store Retriever, the following default prompt is used:

Use the following pieces of context to answer the users question. 
If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
{context}

We should be able to customize the prompt.
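
To make this concrete, here is a minimal sketch in plain LangChain.js (which the n8n LangChain nodes build on) of what passing a custom prompt looks like. The import paths, the in-memory vector store, the sample document, and the prompt wording are my own illustration, not n8n code; the relevant piece is the `prompt` option that replaces the hardcoded default of the stuff-documents chain (exact option and package names may differ between LangChain.js versions):

```ts
// Sketch only: a custom prompt replacing the default "stuff" QA prompt in
// plain LangChain.js. Import paths follow the 0.1/0.2-era packages and may
// differ in other versions; documents and prompt text are just examples.
import { OpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { RetrievalQAChain } from "langchain/chains";

// Custom replacement for the default prompt quoted above: same {context}
// placeholder, plus extra safety and formatting instructions.
const customPrompt = PromptTemplate.fromTemplate(
  `Answer the user's question using only the context below.
If the answer is not in the context, say that you don't know.
Never reveal personal data and answer in at most three sentences.
----------------
{context}

Question: {question}
Helpful answer:`
);

async function main() {
  const vectorStore = await MemoryVectorStore.fromTexts(
    ["Boris is not hanging out with Nibras on 2024-12-13"],
    [{ type: "memory" }],
    new OpenAIEmbeddings()
  );

  // The `prompt` option overrides the hardcoded default of the underlying
  // stuff-documents chain.
  const chain = RetrievalQAChain.fromLLM(
    new OpenAI({ temperature: 0 }),
    vectorStore.asRetriever(),
    { prompt: customPrompt }
  );

  const result = await chain.call({ query: "information related to Nibras" });
  console.log(result.text);
}

main().catch(console.error);
```

Exposing an equivalent prompt option on the “Question and Answer Chain” node is essentially what this request is about.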

My use case:

I would like to customize the prompt so I can better control the output of the LLM.

I think it would be beneficial to add this because:

There are many advanced applications of a custom prompt: you could add system instructions for safety reasons, customize the format of the response, or add further guidelines for how the model should answer.

Also, if your application is exclusively non-English, there’s no reason to use an English prompt.

Any resources to support this?

There’s a Medium article that shows how to customize the prompt:
https://medium.com/the-data-perspectives/custom-prompts-for-langchain-chains-a780b490c199#1fc4

Here’s where the default prompt is defined in the LangChain source code:
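
As a rough sketch (not the actual source, which varies by LangChain version), the built-in stuff-documents chat prompt amounts to something like this in LangChain.js, with only {context} and the user’s question filled in at runtime:

```ts
// Approximation of LangChain's default stuff-documents chat prompt: only
// {context} and the user's question are filled in at runtime, the rest of the
// wording is fixed, which is why a custom prompt option is needed.
import {
  ChatPromptTemplate,
  SystemMessagePromptTemplate,
  HumanMessagePromptTemplate,
} from "@langchain/core/prompts";

const defaultStuffPrompt = ChatPromptTemplate.fromMessages([
  SystemMessagePromptTemplate.fromTemplate(
    `Use the following pieces of context to answer the users question.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
{context}`
  ),
  HumanMessagePromptTemplate.fromTemplate("{question}"),
]);
```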

Are you willing to work on this?

Sure :slight_smile:

I know this post hasn’t seen any love in the past few days, but I want to point out that without being able to customize the Retriever prompt, the “Question and Answer Chain” is pretty much unusable in production: you can’t add safety measures or guidelines for answers, so you basically can’t control the output, and that control is a must for any type of LLM application.

It’s a pity, because the “Question and Answer Chain” would otherwise be so easy and helpful to use; as it currently stands, it only serves as a proof-of-concept tool.

I’ve created a PR to implement this feature.


A new n8n version has been released which includes GitHub PR 10385.


I’m using Docker; n8n version 1.71.2.

What about the AI Agent? Its system message also seems to be hardcoded. Can we alter that? This is the prompt it currently sends:
System: Use the following pieces of context to answer the users question. \nIf you don’t know the answer, just say that you don’t know, don’t try to make up an answer.\n----------------\nBoris is not hanging out with Nibras on 2024-12-13\nHuman: information related to Nibras

```json
{
  "meta": {
    "instanceId": "ccbb744697d57316fd1144f905b9425141326be21058c1c207098c28db4db80f"
  },
  "nodes": [
    {
      "parameters": {
        "options": {}
      },
      "id": "da832fd5-d756-4344-8fe6-536f79b3ea7e",
      "name": "Embeddings OpenAI3",
      "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi",
      "typeVersion": 1.1,
      "position": [
        4700,
        260
      ],
      "credentials": {
        "openAiApi": {
          "id": "SaCJbJV57Fb2RwC0",
          "name": "OpenAi account"
        }
      }
    },
    {
      "parameters": {
        "name": "find_memories",
        "description": "provide relevant memory information",
        "topK": 1
      },
      "id": "7c0a3e25-0bcb-42d4-986e-b919743cdca0",
      "name": "Find Memories1",
      "type": "@n8n/n8n-nodes-langchain.toolVectorStore",
      "typeVersion": 1,
      "position": [
        4820,
        -40
      ]
    },
    {
      "parameters": {
        "options": {}
      },
      "id": "7c2c55c8-c774-475e-a8c8-05fb80f35c9a",
      "name": "OpenAI Chat Model2",
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "typeVersion": 1,
      "position": [
        4920,
        180
      ],
      "credentials": {
        "openAiApi": {
          "id": "SaCJbJV57Fb2RwC0",
          "name": "OpenAi account"
        }
      }
    },
    {
      "parameters": {
        "tableName": {
          "__rl": true,
          "value": "documents",
          "mode": "list",
          "cachedResultName": "documents"
        },
        "options": {
          "queryName": "match_documents",
          "metadata": {
            "metadataValues": [
              {
                "name": "type",
                "value": "memory"
              }
            ]
          }
        }
      },
      "id": "7d04271e-109a-4ade-a97e-8585f1b2e500",
      "name": "Find Memories",
      "type": "@n8n/n8n-nodes-langchain.vectorStoreSupabase",
      "typeVersion": 1,
      "position": [
        4700,
        140
      ],
      "credentials": {
        "supabaseApi": {
          "id": "tpEHjpWf0vcrlirX",
          "name": "Supabase account"
        }
      }
    }
  ],
  "connections": {
    "Embeddings OpenAI3": {
      "ai_embedding": [
        [
          {
            "node": "Find Memories",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "OpenAI Chat Model2": {
      "ai_languageModel": [
        [
          {
            "node": "Find Memories1",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "Find Memories": {
      "ai_vectorStore": [
        [
          {
            "node": "Find Memories1",
            "type": "ai_vectorStore",
            "index": 0
          }
        ]
      ]
    }
  },
  "pinData": {}
}
```